Recently I ported the video client to the Android platform; here I share my development experience.
The approach is similar to HTTP Live Streaming's segmented download, but Apple's TS container format carries too much redundant data, so I use a private format that saves more bandwidth. Encoding is H.264 + AMR-NB, and each segment file covers 20 seconds. The overall architecture is as follows:
The HTTP download module has high stability requirements. Debugging under the NDK is very troublesome, and download efficiency at the Java layer is good enough, so I use the Java-layer HttpURLConnection class. The server puts the filename of the next segment in a response header; on the next connection that filename is sent as a URL parameter, which enables streaming download. A ByteArrayOutputStream instance caches the downloaded data, which is then passed through JNI to the file-parsing layer.
Because data interaction between the Java and JNI layers is time-consuming, download a complete segment first and then feed it to the parser in one call.
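A minimal sketch of that chained download loop, assuming a hypothetical base URL, `file` query parameter, and `X-Next-File` header name (the real names belong to the private server protocol):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SegmentDownloader {
    // Hypothetical base URL -- the real address is server-specific.
    private static final String BASE_URL = "http://example.com/stream";

    // Reads the whole stream into the buffer so that one complete segment
    // crosses the JNI boundary in a single call.
    public static void readFully(InputStream in, ByteArrayOutputStream buf) throws IOException {
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
    }

    // Downloads one segment and returns the name of the next one, taken from
    // a response header, which is what makes the download "streaming".
    public String downloadSegment(String fileName, ByteArrayOutputStream buf) throws IOException {
        HttpURLConnection conn = (HttpURLConnection)
                new URL(BASE_URL + "?file=" + fileName).openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(10000);
        try (InputStream in = conn.getInputStream()) {
            readFully(in, buf);
        }
        String next = conn.getHeaderField("X-Next-File"); // hypothetical header name
        conn.disconnect();
        return next;
    }
}
```

Buffering the entire 20-second segment in `readFully` before handing it off means only one Java-to-native call per file, which is the point of the previous paragraph.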
Audio must play smoothly, so give it a dedicated thread. For video, a timer-driven thread is sufficient and reduces power consumption.
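One way to sketch that threading model, with the per-frame decode calls left as hypothetical placeholders:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PlaybackThreads {
    private final ScheduledExecutorService videoTimer =
            Executors.newSingleThreadScheduledExecutor();
    private volatile boolean running = true;

    // Audio gets a dedicated free-running thread so playback never stutters.
    public Thread startAudio(Runnable decodeOneAudioFrame) {
        Thread t = new Thread(() -> {
            while (running) {
                decodeOneAudioFrame.run(); // paced by the audio sink itself
            }
        }, "audio-decode");
        t.start();
        return t;
    }

    // Video runs off a fixed-rate timer instead of a busy loop, cutting power use.
    public void startVideo(Runnable decodeOneVideoFrame) {
        videoTimer.scheduleAtFixedRate(decodeOneVideoFrame, 0, 100, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        running = false;
        videoTimer.shutdownNow();
    }
}
```

The 100 ms period is an assumption matching the 10 fps streams discussed below; adjust it to the actual frame rate.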
The H.264 and AMR-NB decoders are ported from OpenCORE; Baseline profile is enough for mobile phones. If you want to support AAC, you can use the Helix AAC decoder or OpenCORE's AAC decoder.
The latency of decoding in JNI and then passing frames back up to the Java layer for display is intolerable, so render the image and sound directly at the NDK layer. You can refer to the practices in the android FFMPEG open-source project, or to the methods some netizens have summarized and extracted: http://www.cnblogs.com/mcodec/. The cost is portability: the existing libjnivideo.so and libjniaudio.so builds target Android 2.2, and other versions will report "can not load libjnivideo.so", so they need to be recompiled within the Android source project. Code for FFMPEG and the Android media framework can be downloaded from: https://github.com/havlenapetr
I found that CPU usage spikes sharply when usleep is called at the NDK layer, so it is best not to use usleep for audio/video synchronization.
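One alternative, sketched here on the Java side as an assumption (the original sync lives in the NDK): block the video thread on a monitor that the audio clock notifies, instead of polling with short sleeps.

```java
public class AvSync {
    private final Object lock = new Object();
    private long audioClockMs; // advanced by the audio thread

    // Called by the audio path as it plays; wakes the video thread
    // exactly when the clock moves, so no polling loop is needed.
    public void onAudioProgress(long ptsMs) {
        synchronized (lock) {
            audioClockMs = ptsMs;
            lock.notifyAll();
        }
    }

    // Blocks until the audio clock reaches the frame's timestamp, replacing
    // the repeated short-sleep pattern that caused the CPU spikes.
    public void awaitVideoFrame(long framePtsMs) throws InterruptedException {
        synchronized (lock) {
            while (audioClockMs < framePtsMs) {
                lock.wait(200); // bounded wait as a safety net
            }
        }
    }
}
```

The same shape is available in native code via pthread condition variables, which is the more direct replacement for usleep in the NDK.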
In tests on a V880, with a relatively stable EDGE network, 176*144 at 10 fps with a 48 kbps video bit rate and a 7.95 kbps audio bit rate plays smoothly. The figure below shows the result on a real device. (For live video, feature phones get only a 32 kbps bit rate, so the picture is blurry.)
The latest version beautifies the UI and provides a more stable, optimized playback kernel. It is compatible with 2.2 and with the 2.3-4.0 platforms; on 2.3 and above, OpenGL ES + OpenSL ES are used to render video and audio.
Supports SD (48-60 kbps bit rate, H.264 + AMR-NB encoding) and HD video (-K bit rate, H.264 + AAC encoding). For more information, see