The mobile online streaming media project from the previous phase has finally come to an end. Here is a summary of what we accumulated during development:

1. Playing streaming media online on a mobile phone faces many restrictions, including: 1) the phone's network speed is limited — GPRS typically delivers about 5 KB/s and EDGE about 10 KB/s; 2) phone CPU frequencies are generally low, resulting in poor decoding efficiency; 3) there are many phone platforms to support (Symbian/PPC/Java/MTK).

2. Because phone network speed is limited, different versions of each stream should be produced for different network speeds, so that users can select the appropriate version based on their phone and network environment. Naturally, streams at different bitrates use different resolutions, such as 176*144, 176*208, and so on up to Q-class screens.

3. Porting a player to the phone platform based on the open-source FFmpeg library can effectively speed up development.

4. Because of the low network speeds, the video bitrate must be reduced as much as possible; at the same time, we want clear and smooth playback, and these two goals are in direct tension. At present, H.264 encoding/decoding gives good results: its compression ratio is much higher than that of H.263, and the codec is well suited to mobile applications.

5. Because phone CPU frequencies are low, the phone must decode in real time and scale the video image while simultaneously downloading video data, so it is difficult for low-end phones to deliver a good playback experience. We therefore recommend developing only for relatively high-end phone platforms, such as Symbian and PPC (Windows Mobile); the Java platform can be ignored.
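Point 2 above — picking a stream version that matches the measured network speed — can be sketched as a simple lookup. The profile names, bitrates, and the 80% overhead headroom below are illustrative assumptions, not figures from the actual project:

```c
/* Hypothetical stream profiles for different network speeds.
   Names, bitrates, and resolutions are illustrative only. */
typedef struct {
    const char *name;
    int video_kbps;      /* target video bitrate in kbit/s */
    int width, height;   /* stream resolution */
} StreamProfile;

/* Pick the highest-bitrate profile that fits the measured downlink,
   keeping ~20% headroom for protocol overhead (an assumed margin). */
static const StreamProfile *select_profile(int measured_kBps) {
    static const StreamProfile profiles[] = {
        { "gprs-low",  24, 128,  96 },   /* fits ~5 KB/s GPRS  */
        { "edge-mid",  56, 176, 144 },   /* fits ~10 KB/s EDGE */
        { "edge-high", 80, 176, 208 },
    };
    int usable_kbps = measured_kBps * 8 * 4 / 5;  /* KB/s -> kbit/s, 80% usable */
    const StreamProfile *best = &profiles[0];     /* always fall back to lowest */
    for (unsigned i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (profiles[i].video_kbps <= usable_kbps)
            best = &profiles[i];
    return best;
}
```

A client would run a short bandwidth probe first and then request the matching version from the server.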
6. Even though the video is played online, the player should still provide the features of a local player, including full screen and windowed modes, mute, fast forward and rewind, volume adjustment, pause, and replay.

7. The resolution of the source video is fixed — for example, 176*208. When it is played on a phone with a different screen resolution, the image may need to be scaled up or down accordingly; linear or quadratic interpolation algorithms are generally used for the scaling. Pay attention to the resources these scaling algorithms consume: if the cost is too high, tune them or switch to a lighter-weight algorithm, so that playback stays clear and smooth.

8. A very important issue: use multithreading or an asynchronous approach to keep the streaming data transfer continuous. The details are as follows:

    Client <----> Gateway <----> Streaming Media Server
      Request  ------------------------->
      Response <-------------------------

As the figure shows, each round trip — the client requesting video data from the streaming media server and receiving it back — takes some time; call it T1. During T1 the client does nothing but play video, and the network may sit idle. We should therefore make full use of T1 and send the next request at a reasonable point within it, so that the video data returned by the server is as continuous as possible and the full network bandwidth is used for data transfer:

    Client <----> Gateway <----> Streaming Media Server
      Request 1  ------------------------->
      Request 2  ------------------------->
      Response 1 <-------------------------
      Response 2 <-------------------------
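The scaling mentioned in item 7 can be sketched as bilinear interpolation over a single 8-bit plane. This is only a minimal illustration — a real mobile player would apply it per YUV plane and tune the fixed-point math for the target CPU:

```c
#include <stdint.h>

/* Bilinear resize of an 8-bit grayscale plane using 16.16 fixed-point
   arithmetic (avoids floating point, which is slow on old phone CPUs). */
void scale_bilinear(const uint8_t *src, int sw, int sh,
                    uint8_t *dst, int dw, int dh) {
    for (int y = 0; y < dh; y++) {
        /* map the destination row back into source coordinates */
        int sy = (int)(((int64_t)y * (sh - 1) << 16) / (dh > 1 ? dh - 1 : 1));
        int y0 = sy >> 16, fy = sy & 0xFFFF;
        int y1 = (y0 + 1 < sh) ? y0 + 1 : y0;
        for (int x = 0; x < dw; x++) {
            int sx = (int)(((int64_t)x * (sw - 1) << 16) / (dw > 1 ? dw - 1 : 1));
            int x0 = sx >> 16, fx = sx & 0xFFFF;
            int x1 = (x0 + 1 < sw) ? x0 + 1 : x0;
            /* weighted average of the four neighbouring source pixels */
            int64_t top = src[y0*sw + x0] * (int64_t)(0x10000 - fx)
                        + src[y0*sw + x1] * (int64_t)fx;
            int64_t bot = src[y1*sw + x0] * (int64_t)(0x10000 - fx)
                        + src[y1*sw + x1] * (int64_t)fx;
            dst[y*dw + x] =
                (uint8_t)((top * (int64_t)(0x10000 - fy) + bot * (int64_t)fy) >> 32);
        }
    }
}
```

If even this is too heavy for a low-end phone, nearest-neighbour sampling (dropping the weighting entirely) is the usual lighter-weight fallback, at the cost of blockier output.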
9. If FFmpeg is used as the core container-parsing and decoding engine, consider optimizing the FFmpeg kernel for the peculiarities of mobile phones, such as adjusting the codec code paths. This can raise decoding efficiency on the phone and improve playback smoothness (the gain can exceed 20%; the details will be described later).

10. Pay special attention to audio/video synchronization. Online playback differs from local playback in many ways, and you need to consider how to keep audio and video synchronized under all kinds of abnormal conditions, which requires properly discarding part of the frame data.

11. Essentially every network application on a mobile phone has to go through the CMWAP gateway, and the CMWAP gateway exposes only a standard HTTP interface. All mobile network applications therefore need to consider how to design their protocol on top of HTTP; you can also consider using an HTTP proxy to get through the CMWAP gateway.

12. To get through CMWAP gateways reliably nationwide, it is recommended that each data packet be kept small — on the order of kilobytes. If packets are too large, the CMWAP gateways in some provinces may intercept them. (The same issue must be considered when a download tool on the phone performs multipart downloads.)

13. For regular live TV broadcasting, a dedicated capture card is required to take in the analog or digital signal; use the DirectShow framework to capture and synchronize the audio and video data, encode them, mux them into a container, and transmit (for example, MP4/MTK/WMV).

14. For webpage-based capture, you can refer to one of my earlier articles.
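The frame-dropping mentioned in item 10 usually reduces to a per-frame decision against the audio clock (audio as the master clock is the common choice). The thresholds below are illustrative assumptions; real players tune them empirically:

```c
/* Possible actions for a decoded video frame. */
typedef enum { FRAME_SHOW, FRAME_WAIT, FRAME_DROP } FrameAction;

/* Decide how to handle a video frame relative to the audio clock.
   video_pts_ms: presentation timestamp of the frame.
   audio_clock_ms: current audio playback position (master clock).
   The 40 ms / 80 ms thresholds are assumed values for illustration. */
FrameAction sync_decide(long video_pts_ms, long audio_clock_ms) {
    long diff = video_pts_ms - audio_clock_ms;
    if (diff > 40)      /* frame is early: wait before showing it      */
        return FRAME_WAIT;
    if (diff < -80)     /* frame is hopelessly late: discard it        */
        return FRAME_DROP;
    return FRAME_SHOW;  /* close enough to the audio clock: display it */
}
```

Under network stalls the video clock falls behind quickly, so the DROP branch is what keeps audio and video from drifting apart permanently.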
Using the video card's image driver, you can capture the video data of a specified screen region and capture the sound through the sound card, then synchronize the audio and video and mux them into a container (such as MP4, MTK, or WMV).

If the problems above can all be taken into account and solved, this mobile streaming media product will have the conditions for success.