x264 FFmpeg Codec

Source: Internet
Author: User


Demo: https://github.com/wangzuxing/myffmpegh264h265yuvopengl

H.264 encoding:
Camera preview data on the Java side is sent to the JNI layer, where the x264 library (or FFmpeg) encodes it and writes a .264 file.

H.264 decoding:
The MediaCodec-encoded camera preview data on the Java side is sent to the JNI layer, where FFmpeg decodes it in real time; the decoded YUV data is passed straight back to Java for display (OpenGL YUV rendering).

Java side:
MainActivity0:

static {
    ...
    System.loadLibrary("x264");
    System.loadLibrary("ffmpeg");
    ...
}

x264 encoding: encode directly with the x264 library.

public native void Mp4cc(String h264_file);          // path of the .264 file to write
public native void Mp4ce(byte[] array, int length);  // camera preview data passed to JNI, encoded by x264, then written to the file
public native void Mp4ee();
...

FFmpeg + x264 encoding: compile x264 into libx264.so, build it into FFmpeg, and drive the encode through the unified avcodec API.
// avcodec_register_all, avcodec_find_encoder, avcodec_alloc_context3, etc. set up the encoding
// environment directly and work on the objects AVCodecContext -> avcodec (AVFrame, AVPacket).
// (Alternatively, depending on actual needs, build the fuller hierarchy with
// avformat_alloc_context/avformat_alloc_output_context2:
// AVFormatContext -> AVStream -> AVCodecContext -> avcodec (AVFrame, AVPacket).)

public native void Mp4fpgcc(String h264_file);
public native void Mp4fpgce(byte[] array, int length);
public native void Mp4fpgee();
...

FFmpeg decoding (x264 itself is encode-only; decoding uses FFmpeg's built-in H.264 decoder):
Call avcodec_register_all, avcodec_find_decoder, avcodec_alloc_context3, etc. directly.

h264_file: the .264 file to decode; out_file: a YUV file the decoded output can be written to for testing. Here Mp4fpgdcc0() only initializes the relevant parameters and does not operate on the files.

The MediaCodec-encoded camera preview data is passed into JNI, decoded there by FFmpeg, and the decoded YUV data is sent straight back to Java for display (OpenGL YUV rendering).

public native void Mp4fpgdcc0(String out_file, String h264_file);
public native void Mp4fpgdce0(byte[] array, int length);
public native void Mp4fpgdee0();

JNI side:
mp.c

Encoding with the x264 library (the x264 API usage is roughly the same as x265's):

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4cc(JNIEnv *env, jclass clz, jstring h264_file)
{
    const char *h264_file_n = (*env)->GetStringUTFChars(env, h264_file, NULL);
    h264s_file = fopen(h264_file_n, "wb");
    if (h264s_file == NULL) {
        LOGI("mp4cc could not open output file");
        (*env)->ReleaseStringUTFChars(env, h264_file, h264_file_n);
        return;
    }
    yuv_size = width * height * 3 / 2;                  // size of one I420 frame

    x264_param_default_preset(&m_param, "fast", "zerolatency");

    int m_bitrate = 2000000;
    m_param.rc.i_rc_method = X264_RC_ABR;               // rate control: CQP (constant quantizer), CRF (constant rate factor), ABR (average bitrate);
                                                        // CQP targets a quantizer value, ABR targets an output bitrate, CRF targets "visual quality"
    m_param.rc.i_bitrate = (int)(m_bitrate / 1000);     // x264 takes kbit/s
    m_param.rc.i_vbv_max_bitrate = (int)((m_bitrate * 1.2) / 1000); // max instantaneous bitrate in ABR mode; default 0 (same as -B)
    m_param.i_threads = 1;                              // number of worker threads
    m_param.i_width = width;
    m_param.i_height = height;
    m_param.i_fps_num = fps;
    m_param.i_fps_den = 1;
    m_param.i_bframe = 10;                              // number of B-frames between two reference frames
    //m_param.i_csp = (csp == 17) ? X264_CSP_NV12 : csp; // colorspace of the input stream; only I420 is supported here
    m_param.i_keyint_max = 25;                          // maximum IDR keyframe interval
    m_param.b_intra_refresh = 1;
    m_param.b_annexb = 1;                               // prefix each NAL unit with a 4-byte start code
    //m_param.b_repeat_headers = 1;                     // repeat SPS/PPS before keyframes
    x264_param_apply_profile(&m_param, "baseline");     // profile: baseline / main / high

    encoder = x264_encoder_open(&m_param);
    x264_encoder_parameters(encoder, &m_param);         // read back the parameters actually in use
    x264_picture_alloc(&pic_in, X264_CSP_I420, width, height); // encoder input format is I420

    yuv_buffer = (uint8_t *)malloc(yuv_size);
    pic_in.img.i_csp = X264_CSP_I420;
    pic_in.img.i_plane = 3;
    pic_in.img.plane[0] = yuv_buffer;
    pic_in.img.plane[1] = pic_in.img.plane[0] + width * height;
    pic_in.img.plane[2] = pic_in.img.plane[1] + width * height / 4;

    run_ce = 0;
    i_pts = 0;

    /* maximum number of frames the encoder is allowed to buffer */
    int iMaxFrames = x264_encoder_maximum_delayed_frames(encoder);
    LOGI("Mp4ce iMaxFrames = %d", iMaxFrames);
    /* number of frames currently buffered inside x264 */
    int iFrames = x264_encoder_delayed_frames(encoder);

    (*env)->ReleaseStringUTFChars(env, h264_file, h264_file_n);
}

int nal_n;

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4ce(JNIEnv *env, jclass clz, jbyteArray data, jint size)
{
    unsigned char *buf = (unsigned char *)(*env)->GetByteArrayElements(env, data, JNI_FALSE);

    memcpy(yuv_buffer, buf, yuv_size);
    nnal = 0;
    nal_n = 0;
    pic_in.i_pts = i_pts++;

    x264_encoder_encode(encoder, &nals, &nnal, &pic_in, &pic_out);
    x264_nal_t *nal;
    for (nal = nals; nal < nals + nnal; nal++) {
        nal_n++;
        fwrite(nal->p_payload, 1, nal->i_payload, h264s_file);
    }
    run_ce++;
    LOGI("Mp4ce %d %d", run_ce, nnal);
    (*env)->ReleaseByteArrayElements(env, data, (jbyte *)buf, 0);
}

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4ee(JNIEnv *env, jclass clz)
{
    /* flush the frames still buffered in the encoder */
    while (1) {
        int j;
        int ret = x264_encoder_encode(encoder, &nals, &nnal, NULL, &pic_out);
        if (ret <= 0) {
            break;
        }
        for (j = 0; j < nnal; j++) {
            fwrite(nals[j].p_payload, 1, nals[j].i_payload, h264s_file);
        }
    }
    LOGI("Mp4ee end");
    x264_picture_clean(&pic_in);   /* only pic_in was allocated; pic_out is filled by the encoder */
    x264_encoder_close(encoder);
    fclose(h264s_file);
    free(yuv_buffer);
}

/* Encode using FFmpeg's libx264 (invoked uniformly as an AVCodec inside FFmpeg) */
AVCodec *ptrCodec;
AVCodecContext *pCtx = NULL;
FILE *ptrF;
FILE *ptrFo;
AVFrame *ptrFrame;
AVPacket avpkt;

uint8_t endcode[] = {0, 0, 1, 0xb7};
JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4fpgcc(JNIEnv *env, jclass clz, jstring h264_file)
{
    const char *h264_file_n = (*env)->GetStringUTFChars(env, h264_file, NULL);
    filename = h264_file_n;                  /* path the encoded file is written to */
    int ret;
    int codec_id = AV_CODEC_ID_H264;

    /* register all codecs */
    avcodec_register_all();

    LOGI("Mp4fpgcc %s", filename);
    /* find the AVCodec by its AVCodecID */
    ptrCodec = avcodec_find_encoder(codec_id);
    if (!ptrCodec) {
        LOGI("Codec not found\n");
        exit(1);
    }
    /* create the AVCodecContext from the AVCodec */
    pCtx = avcodec_alloc_context3(ptrCodec);
    if (!pCtx) {
        LOGI("Could not allocate video codec context\n");
        exit(1);
    }

    int bitrate = 1000;
    int br = 1000 * 1000;
    int fps = 25;
    pCtx->codec_type = AVMEDIA_TYPE_VIDEO;
    /*
    pCtx->bit_rate = br;
    pCtx->rc_min_rate = br;
    pCtx->rc_max_rate = br;
    pCtx->bit_rate_tolerance = br;
    pCtx->rc_buffer_size = br;
    pCtx->rc_initial_buffer_occupancy = pCtx->rc_buffer_size * 3 / 4;
    pCtx->rc_buffer_aggressivity = (float)1.0;
    pCtx->rc_initial_cplx = 0.5f;
    */
    //av_opt_set(pCtx->priv_data, "crf", "1", AV_OPT_SEARCH_CHILDREN);
    pCtx->bit_rate = bitrate * 1000;
    pCtx->bit_rate_tolerance = 2000000;      /* how many bits the stream may deviate from the configured (CBR or VBR)
                                                rate; if bit_rate is not set, the CRF parameter is used instead */
    pCtx->width = width;
    pCtx->height = height;
    pCtx->time_base.den = fps;
    pCtx->time_base.num = 1;
    pCtx->gop_size = fps * 10;
    pCtx->refs = 3;
    pCtx->max_b_frames = 3;                  /* B-frames allowed between two non-B frames; 0 disables B-frames;
                                                more B-frames means a smaller output */
    //pCtx->trellis = 2;
    pCtx->me_method = 8;
    pCtx->me_range = 64;                     //16;
    pCtx->me_subpel_quality = 7;
    pCtx->qmin = 10;                         /* 0~31; the lower the value, the finer the quantization,
                                                the higher the quality and the larger the stream */
    pCtx->qmax = 51;
    //pCtx->rc_initial_buffer_occupancy = 0.9;
    pCtx->i_quant_factor = 1.0 / 1.40f;      /* qscale factor between P- and I-frames; the closer to 1, the clearer
                                                the P-frames: P quantizer = I quantizer * i_quant_factor + i_quant_offset
                                                (x4->params.rc.f_ip_factor = 1 / fabs(avctx->i_quant_factor)) */
    pCtx->b_quant_factor = 1.30f;            /* qscale factor between B- and non-B-frames; the larger the value, the worse
                                                the B-frames: B quantizer = previous P quantizer * b_quant_factor + b_quant_offset */
    //pCtx->chromaoffset = 0;
    pCtx->max_qdiff = 4;
    pCtx->qcompress = 0.6f;                  /* 0.0-1.0; how much the quantizer may vary between
                                                "easy" and "hard" scenes */
    pCtx->qblur = 0.5f;                      /* 0.0-1.0; how much the quantizer is smoothed over time (0 = none) */
    pCtx->pix_fmt = AV_PIX_FMT_YUV420P;
    pCtx->scenechange_threshold = 40;
    pCtx->flags |= CODEC_FLAG_LOOP_FILTER;
    pCtx->me_cmp = FF_CMP_CHROMA;
    pCtx->flags2 |= CODEC_FLAG_NORMALIZE_AQP;

    pCtx->keyint_min = 25;
    //pCtx->rc_qsquish = 1.0;                /* how qmin/qmax limit the rate: 0 clips each quantizer to the range,
                                                1 uses a differentiable function */
    pCtx->level = 30;
    pCtx->b_frame_strategy = 2;
    pCtx->codec_tag = 7;

    /* encoder presets via an AVDictionary:
    AVDictionary *dictParam = 0;
    if (pCtx->codec_id == AV_CODEC_ID_H264) {
        av_dict_set(&dictParam, "preset", "medium", 0);
        av_dict_set(&dictParam, "tune", "zerolatency", 0);
        av_dict_set(&dictParam, "profile", "main", 0);
    }
    */
    if (codec_id == AV_CODEC_ID_H264) {
        //av_opt_set(pCtx->priv_data, "preset", "slow", 0);
        av_opt_set(pCtx->priv_data, "preset", "ultrafast", 0);
        av_opt_set(pCtx->priv_data, "tune", "zerolatency", 0);
        av_opt_set(pCtx->priv_data, "profile", "main", 0);
    }

    /* open it */
    if (avcodec_open2(pCtx, ptrCodec, NULL) < 0) {
        LOGI("Could not open codec\n");
        exit(1);
    }

    ptrF = fopen(filename, "wb");
    if (!ptrF) {
        LOGI("Could not open %s\n", filename);
        exit(1);
    }

    ptrFrame = av_frame_alloc();
    if (!ptrFrame) {
        LOGI("Could not allocate video frame\n");
        exit(1);
    }
    ptrFrame->format = pCtx->pix_fmt;
    ptrFrame->width = pCtx->width;
    ptrFrame->height = pCtx->height;

    /* the image can be allocated by any means; av_image_alloc() is just
       the most convenient way if av_malloc() is to be used */
    ret = av_image_alloc(ptrFrame->data, ptrFrame->linesize, pCtx->width, pCtx->height, pCtx->pix_fmt, 32);
    if (ret < 0) {
        LOGI("Could not allocate raw picture buffer\n");
        exit(1);
    }

    /* allocate the input buffer according to the I420 format */
    picture_size = pCtx->width * pCtx->height * 3 / 2;
    picture_buf = (uint8_t *)av_malloc(picture_size);
    int y_size = pCtx->width * pCtx->height;

    LOGI("w = %d, h = %d, picture_size = %d, y_size = %d\n", pCtx->width, pCtx->height, picture_size, y_size);
    /* I420 plane layout */
    ptrFrame->data[0] = picture_buf;                  /* Y */
    ptrFrame->data[1] = picture_buf + y_size;         /* U */
    ptrFrame->data[2] = picture_buf + y_size * 5 / 4; /* V */

    av_init_packet(&avpkt);
    avpkt.data = NULL;                       /* packet data is allocated by the encoder */
    avpkt.size = 0;
    framecnt = 0;
    ffmpeg_ce = 0;
    frame_pts = 0;
    (*env)->ReleaseStringUTFChars(env, h264_file, h264_file_n);
}

int total_st;
int total_stream;

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4fpgce(JNIEnv *env, jclass clz, jbyteArray data, jint size)
{
    unsigned char *buf = (unsigned char *)(*env)->GetByteArrayElements(env, data, JNI_FALSE);

    memcpy(picture_buf, buf, picture_size);

    int got_picture = 0;
    av_init_packet(&avpkt);
    avpkt.data = NULL;                       /* packet data is allocated by the encoder */
    avpkt.size = 0;

    /* video timestamp:
    pkt.pts = inc++ * (1000 / fps);          // inc starts at 0 and grows by 1 per frame
    pkt.pts = m_nVideoTimeStamp++ * (pCtx->time_base.num * 1000 / pCtx->time_base.den);
    */

    /* encode */
    int ret = avcodec_encode_video2(pCtx, &avpkt, ptrFrame, &got_picture);
    if (ret < 0) {
        LOGI("Mp4fpgce failed to encode!");
        (*env)->ReleaseByteArrayElements(env, data, (jbyte *)buf, 0);
        return;
    }
    if (got_picture) {
        //if (pCtx->coded_frame->pts != AV_NOPTS_VALUE) {
        //    avpkt.pts = av_rescale_q(pCtx->coded_frame->pts, pCtx->time_base, ost->st->time_base);
        //}
        //pkt.stream_index = video_st0->index;
        //avpkt.pts = frame_pts * 1000 / 25;
        avpkt.pts = frame_pts * (pCtx->time_base.num * 1000 / pCtx->time_base.den);
        frame_pts++;
        fwrite(avpkt.data, 1, avpkt.size, ptrF);
        av_packet_unref(&avpkt);
    }
    ffmpeg_ce++;
    (*env)->ReleaseByteArrayElements(env, data, (jbyte *)buf, 0);
}

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4fpgee(JNIEnv *env, jclass clz)
{
    LOGI("Mp4fpgee");
    int got_output = 0;
    /* flush the delayed frames out of the encoder */
    int ret = avcodec_encode_video2(pCtx, &avpkt, NULL, &got_output);
    if (ret < 0) {
        LOGI("Error encoding frame\n");
        exit(1);
    }
    if (got_output) {
        avpkt.pts = frame_pts * (pCtx->time_base.num * 1000 / pCtx->time_base.den);
        frame_pts++;
        fwrite(avpkt.data, 1, avpkt.size, ptrF);
        av_packet_unref(&avpkt);
    }
    /* add sequence end code to make a real MPEG file */
    fwrite(endcode, 1, sizeof(endcode), ptrF);

    fclose(ptrF);
    avcodec_close(pCtx);
    av_free(pCtx);
    av_freep(&ptrFrame->data[0]);

    av_frame_free(&ptrFrame);
}

/* Decode using FFmpeg (invoked uniformly through the avcodec API; x264 itself is encode-only) */
uint8_t *packet_buf;

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4fpgdcc0(JNIEnv *env, jclass clz, jstring outfile_name, jstring h264_file)
{
    const char *h264_file_n = (*env)->GetStringUTFChars(env, h264_file, NULL);   /* the .264 file to decode */
    const char *h264_file_o = (*env)->GetStringUTFChars(env, outfile_name, NULL);
    filename = h264_file_n;
    outfilename = h264_file_o;
    int ret, i;
    int codec_id = AV_CODEC_ID_H264;

    avcodec_register_all();
    /* find the H.264 video decoder */
    ptrCodec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!ptrCodec) {
        LOGI("Codec not found\n");
        exit(1);
    }
    pCtx = avcodec_alloc_context3(ptrCodec);
    if (!pCtx) {
        LOGI("Could not allocate video codec context\n");
        exit(1);
    }
    if (ptrCodec->capabilities & AV_CODEC_CAP_TRUNCATED)
        pCtx->flags |= AV_CODEC_FLAG_TRUNCATED;   /* we do not send complete frames */

    /* open it */
    if (avcodec_open2(pCtx, ptrCodec, NULL) < 0) {
        LOGI("Could not open codec\n");
        exit(1);
    }
    ptrFrame = av_frame_alloc();
    if (!ptrFrame) {
        LOGI("Could not allocate video frame\n");
        exit(1);
    }
    av_init_packet(&avpkt);
    frame_count = 0;
    ffmpeg_ce = 0;
    frame_pts = 0;
    packet_buf = (unsigned char *)malloc(320 * 240 / 2);   // or (uint8_t *)av_malloc(pCtx->width * pCtx->height * 3 / 2)
    LOGI("mp4fpgdcc0 end\n");
    (*env)->ReleaseStringUTFChars(env, h264_file, h264_file_n);
    (*env)->ReleaseStringUTFChars(env, outfile_name, h264_file_o);
}

JNIEXPORT void JNICALL Java_com_example_mymp4v2h264_MainActivity0_mp4fpgdce0(JNIEnv *env, jobject obj, jbyteArray data, jint size)
{
    int len;
    unsigned char *buf = (unsigned char *)(*env)->GetByteArrayElements(env, data, JNI_FALSE);

    memcpy(packet_buf, buf, size);

    got_frame = 0;
    av_init_packet(&pkt);
    pkt.size = size;
    pkt.data = packet_buf;
    while (pkt.size > 0) {
        len = avcodec_decode_video2(pCtx, ptrFrame, &got_frame, &pkt);
        if (pkt.data) {
            pkt.size -= len;
            pkt.data += len;
        }
        if (len < 0) {
            LOGI("Error while decoding frame %d\n", frame_count);
            av_packet_unref(&pkt);
            break;
        }
        if (got_frame) {
            if (frame_size == 0) {
                frame_size = ptrFrame->width * ptrFrame->height;
                frame_size_l = frame_size * 3 / 2;
            }
            if (!temp_store_a && frame_size > 0) {
                temp_store_a = (unsigned char *)malloc(frame_size * 3 / 2);
                LOGI("Saving frame %d, %d, %d\n", frame_count, frame_size, frame_size_l);
            }
            /* copy the decoded Y, U and V planes into one contiguous I420 buffer */
            pgm_save2(ptrFrame->data[0], ptrFrame->linesize[0], ptrFrame->width, ptrFrame->height, temp_store_a);
            pgm_save2(ptrFrame->data[1], ptrFrame->linesize[1], ptrFrame->width / 2, ptrFrame->height / 2, temp_store_a + frame_size);
            pgm_save2(ptrFrame->data[2], ptrFrame->linesize[2], ptrFrame->width / 2, ptrFrame->height / 2, temp_store_a + frame_size * 5 / 4);

            if (method1 == 0) {
                /* 1. find the Java class:
                   jclass (*FindClass)(JNIEnv*, const char*); */
                jclass dpclazz = (*env)->FindClass(env, "com/example/mymp4v2h264/MainActivity0"