The Road to Android Streaming Media Development, Part Two: Developing an Android RTMP Live Streaming Program with the NDK


Developing an Android-side RTMP live streaming program with the NDK

After some effort, I have finally succeeded in porting my RTMP live streaming code to Android by cross-compiling it with the NDK. The program captures camera and microphone data on the Android side, encodes it as H.264 video and AAC audio, and sends it to an RTMP server, enabling live streaming from an Android camera. The program is called NdkRtmpEncoder; here I introduce the whole process and the general framework for anyone who needs it.

Development ideas

First of all, why use the NDK? Because I have already implemented RTMP push streaming, RTMP playback, RTSP transcoding, and other streaming media projects in C++, I have very mature code modules. Since Android provides the NDK, that mature code can be reused through JNI, which greatly simplifies and accelerates the project implementation, so why not? As on other platforms, the following points need to be implemented to capture the camera and push a live stream:

    • Get the Android camera data
    • Encode the camera data as H.264
    • Encapsulate the encoded data in the RTMP protocol and push it

The development ideas for each point are as follows:

    1. Capturing raw camera data on the Android side: this can be done in the Java layer through Camera2, or natively through the NDK's NativeCamera, but the latter requires a higher API level. After some thought, I decided to obtain the data in the Java layer and then hand it over to the lower layer for processing (a sketch of such a JNI bridge follows this list).
    2. H.264 encoding: this can be done in hardware through Android MediaCodec, or in software with x264. Since the goal is to reuse my previous code, I decided to verify with software encoding first.
    3. RTMP protocol encapsulation: this part directly reuses my previous C++ code, which is itself platform-independent; the NDK is also a Linux development environment, and socket network communication works the same way. For details, see my earlier article "C++ Implementation of the RTMP Protocol to Send H.264-Encoded Video and AAC-Encoded Audio".
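
Since the C++ modules are reused through JNI, the Java layer needs a small bridge class around the native library. The sketch below is for illustration only: the method names and signatures are assumptions, not the actual NdkRtmpEncoder API.

    // Minimal sketch of a Java-side bridge to the native push library.
    // Method names and signatures are hypothetical, for illustration only.
    public class RtmpNativeBridge {
        static {
            // Loads librtmp_enc_sdk.so, cross-compiled with the NDK
            System.loadLibrary("rtmp_enc_sdk");
        }

        // Connect to the RTMP server and initialize the encoders
        public static native boolean nativeOpen(String rtmpUrl,
                                                int width, int height, int fps);

        // Hand one raw RGB frame to the C++ layer for encoding and pushing
        public static native void nativeEncodeFrame(byte[] rgb, long ptsMs);

        // Flush the encoders and close the RTMP connection
        public static native void nativeClose();
    }
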
Program Framework

According to these development ideas, the framework of the program is clear:

Some details are omitted from the framework diagram; for example, above the .so dynamic library there is an encapsulation module that the Activity calls.

    1. The Java layer mainly does data collection. Camera frames are obtained through the Camera2 interface as an updated Surface and handed to OpenGL ES/EGL for drawing; the frames are drawn to the TextureView's SurfaceTexture, while the raw RGB data is called back to the Activity, which transfers it to the dynamic library. For getting camera data through the Camera2 interface, you can refer to my previous article "Android Streaming Media Development: Camera2 Captures Raw Camera Data with Manual Preview"; the difference is that that article used the ImageReader's Surface directly, while here a custom Surface is used.

    2. The C++ layer encodes the raw data, encapsulates it into RTMP packets, and pushes them to the RTMP server. For this part, refer to my previous article "C++ Implementation of the RTMP Protocol to Send H.264-Encoded Video and AAC-Encoded Audio".

Cross-compiling

This part is also one of the main tasks. To use C++ code on Android, it must be compiled into a dynamic library that the app then invokes through JNI. Android is essentially Linux, so the method is basically the same as cross-compiling for other embedded ARM Linux targets, provided the cross-compilation environment is set up. Anyone familiar with the NDK knows that Google provides a complete compiler toolchain, which also includes the SDK, here: "NDK Downloads". I work on Ubuntu Linux, so I chose the "Linux 64-bit (x86)" version. Remember that the Linux environment must be 64-bit, otherwise you cannot compile anything.

After decompressing it, you are basically ready to start. There are, however, two ways to compile. The first is the same as for other arm-linux environments: configure the cross-compiler toolchain environment and then compile in the normal Linux way. The second is to write an Android.mk file and compile with the ndk-build script provided in the NDK.

1. Toolchain mode

The first way is actually relatively simple: after installing the cross-compiler toolchain and configuring the environment, you can compile, for example with a configuration like the following.
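
The original configuration listing is not preserved in this copy; what follows is a minimal sketch, assuming a standalone ARM toolchain generated with the NDK's make-standalone-toolchain.sh script (paths, API level, and prefixes are illustrative, not from the original article):

    # Generate a standalone toolchain from the NDK (paths are illustrative)
    $NDK_HOME/build/tools/make-standalone-toolchain.sh \
        --arch=arm --platform=android-19 \
        --install-dir=$HOME/android-toolchain

    # Point the build environment at the cross-compilers
    export PATH=$HOME/android-toolchain/bin:$PATH
    export CC=arm-linux-androideabi-gcc
    export CXX=arm-linux-androideabi-g++
    export AR=arm-linux-androideabi-ar

    # Then build in the normal Linux way, e.g.:
    # ./configure --host=arm-linux-androideabi && make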

That is basically it. Of course, different projects may need the configuration modified further, or configure to be run before make, and so on, but this is the general idea.

2. ndk-build mode

Android.mk differs greatly from a Makefile and has its own syntax; its place in the overall build process is perhaps closer to that of Makefile.am in the Automake tools. For its syntax, see my mk file below, where I have added some comments to help understanding; for the full syntax, refer to the official Android Developer website. Here is the main content of the Android.mk for my rtmp_enc_sdk.so dynamic library, for reference.
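
The original Android.mk listing did not survive in this copy of the article; below is a minimal sketch of what an Android.mk for a shared library like rtmp_enc_sdk.so typically looks like (the source file names and flags are assumptions):

    # LOCAL_PATH must point at the directory containing this Android.mk
    LOCAL_PATH := $(call my-dir)

    # Clear variables left over from other modules
    include $(CLEAR_VARS)

    # Output module name: produces librtmp_enc_sdk.so
    LOCAL_MODULE := rtmp_enc_sdk

    # C++ sources to compile (names are illustrative)
    LOCAL_SRC_FILES := \
        jni_interface.cpp \
        h264_encoder.cpp \
        rtmp_sender.cpp

    # Header search paths
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/include

    # Enable exceptions and RTTI for the C++ code
    LOCAL_CPPFLAGS := -fexceptions -frtti

    # System libraries to link against (Android log, zlib)
    LOCAL_LDLIBS := -llog -lz

    # Build a shared library
    include $(BUILD_SHARED_LIBRARY)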

The pattern is basically fixed; following this template, it is not difficult to adapt it to your own project.

Key code

For both the Java-layer and C++-layer code, my previous articles already introduce the logical structure and implementation methods; interested readers can refer to them and, by following the framework described there, should be able to implement this themselves. Below is the Java-layer logic that captures the camera and handles its data.

1. When the TextureView becomes available, creation work starts. The first thing is to generate an OES SurfaceTexture, which is then passed to the Camera2 interface to receive the camera image; after that, the calling thread for the RTMP push module is created, along with the camera capture module and the render module.

2. When the OES texture image becomes valid, the actual resolution of the camera image is obtained, together with the rotation matrix, image rotation information, and so on; these are packaged together and handed to EglRender, notifying the render module to draw the image.

3. After the render module has drawn the data, it reads back the raw RGB data and delivers it through a callback, here to the RTMP send thread, which calls the dynamic library to perform the final H.264 encoding and push to the RTMP server; this is what the C++ layer's .so dynamic library does.
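
The Java listing from the original article is not reproduced in this copy; the sketch below only illustrates the three steps above. CameraCapture, EglRender's listener interface, and RtmpPushThread are hypothetical stand-ins for the author's modules, shown to make the flow concrete:

    import android.app.Activity;
    import android.graphics.SurfaceTexture;
    import android.view.TextureView;

    // Hypothetical sketch of the capture flow; not the actual project code.
    public class LivePreviewActivity extends Activity
            implements TextureView.SurfaceTextureListener {

        private RtmpPushThread pushThread; // thread calling the native .so
        private CameraCapture camera;      // Camera2-based capture module
        private EglRender render;          // OpenGL ES/EGL render module

        // Step 1: the TextureView is ready -- create the OES SurfaceTexture,
        // hand it to Camera2, and start the push, capture and render modules.
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {
            pushThread = new RtmpPushThread("rtmp://example.com/live/test");
            pushThread.start();
            render = new EglRender(st, w, h); // draws onto the TextureView
            camera = new CameraCapture(this, render.getOesSurfaceTexture());
            camera.open();

            // Steps 2 and 3: when a camera frame lands on the OES texture,
            // the renderer draws it using the resolution and rotation info,
            // reads back the raw RGB pixels, and this callback forwards them
            // to the push thread, which calls into the native library.
            render.setOnFrameDrawnListener((rgb, width, height) ->
                    pushThread.postFrame(rgb, width, height));
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) { }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
            camera.close();      // stop the camera
            pushThread.quit();   // close the encoder and RTMP connection
            return true;
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture st) { }
    }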

Running results

The RTMP push streaming screen on the phone:

Playing the RTMP live video on a PC with Flash:

haibindev.cnblogs.com. For cooperation, please contact me via QQ. (For reprints, please credit the author and source.)
