The data source for stream playback is supplied through the ISource interface, and the data itself may come from the network, from memory, or from a file. Streamed content falls into two categories: formatted media that carries a header locating the start of the raw data, such as a .wav file, and raw data whose encoding/decoding method the application supplies separately. Stream playback requires a concrete implementation of ISource: the application creates an ISource instance, which must remain valid for the entire lifetime of the IMedia interface. The following is a simple example of stream playback for a WAV file.
static void MyApp_SetupSource(MyApp *pMe)
{
   AEEMediaDataEx md;
   IFileMgr *pFM;
   ISourceUtil *pSU;

   // Step 1: Create an IMedia PCM object in the Idle state
   ISHELL_CreateInstance(pMe->a.m_pIShell, AEECLSID_MEDIAPCM, (void **)&pMe->m_pIMedia);

   // Step 2: Create a concrete ISource object
   ISHELL_CreateInstance(pMe->a.m_pIShell, AEECLSID_FILEMGR, (void **)&pFM);
   pMe->m_pFile = IFILEMGR_OpenFile(pFM, "sample.wav", _OFM_READ);
   IFILEMGR_Release(pFM);
   ISHELL_CreateInstance(pMe->a.m_pIShell, AEECLSID_SOURCEUTIL, (void **)&pSU);
   ISOURCEUTIL_SourceFromAStream(pSU, (IAStream *)pMe->m_pFile, &pMe->m_pISource);
   ISOURCEUTIL_Release(pSU);

   // Step 3: Initialize AEEMediaDataEx with the ISource
   md.clsData = MMD_ISOURCE;
   md.pData = (void *)pMe->m_pISource;
   md.dwSize = 0;
   md.dwStructSize = sizeof(md);   // size of the AEEMediaDataEx structure
   md.dwCaps = 0;
   md.bRaw = FALSE;                // raw data? FALSE means no
   md.dwBufferSize = 0;            // internal buffer size; 0 selects the default
   md.pSpec = NULL;                // used only for raw-data formats
   md.dwSpecSize = 0;              // used only for raw-data formats

   // Step 4: Set the media data; the IMedia object enters the Ready state
   IMEDIA_SetMediaDataEx(pMe->m_pIMedia, &md, 1);
}
For stream playback of raw data there is no end-of-media marker, so the application must call IMEDIA_Stop() itself at the appropriate point during playback. In the AEEMediaDataEx structure, set bRaw to TRUE and point pSpec at the specification of the chosen encoding/decoding method.
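As a minimal sketch of how the raw-data case differs from the WAV example above, the code below fills the same fields but sets bRaw and supplies a codec spec. The struct is a simplified stand-in that mirrors the AEEMediaDataEx fields used here (the real definition comes from the BREW headers), and the RawPcmSpec type is a hypothetical spec for illustration only.

```c
#include <stddef.h>

/* Simplified stand-in mirroring the AEEMediaDataEx fields used here
 * (the real structure is defined in the BREW SDK headers). */
typedef struct {
    unsigned clsData;      /* MMD_ISOURCE in the example above        */
    void    *pData;        /* the ISource object                      */
    unsigned dwSize;
    unsigned dwStructSize;
    unsigned dwCaps;
    int      bRaw;         /* TRUE for raw data with no media header  */
    unsigned dwBufferSize;
    void    *pSpec;        /* codec spec, required when bRaw is TRUE  */
    unsigned dwSpecSize;
} MediaDataEx;

/* Hypothetical codec spec for raw PCM: not a BREW type. */
typedef struct {
    unsigned nSampleRate;
    unsigned nChannels;
    unsigned nBitsPerSample;
} RawPcmSpec;

/* Fill the structure for raw-data playback: bRaw is TRUE and pSpec
 * describes the encoding, since the stream carries no header. */
static void setup_raw_media_data(MediaDataEx *md, void *pSource, RawPcmSpec *spec)
{
    md->clsData      = 0;            /* would be MMD_ISOURCE on BREW   */
    md->pData        = pSource;
    md->dwSize       = 0;
    md->dwStructSize = sizeof *md;
    md->dwCaps       = 0;
    md->bRaw         = 1;            /* raw data: no built-in terminator */
    md->dwBufferSize = 0;            /* 0 selects the default size     */
    md->pSpec        = spec;
    md->dwSpecSize   = sizeof *spec;
}
```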
Streaming media playback is stream playback from the network using streaming transmission: instead of downloading all of the multimedia data first, the client downloads and plays at the same time. Only some initial data is fetched into a local buffer; once enough has accumulated to sustain continuous playback, playback begins, and subsequent data flows into the buffer on request so that the clip forms a complete data stream. The popular network-TV application PPLive, for example, uses this technique. Because of the constraints of wireless networks, mobile streaming generally uses a unicast mode: each receiver establishes a one-to-one connection with the streaming media server, each user sends its own data request, and the server sends that user a separate copy of the data.
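The buffer-then-play behaviour described above can be sketched as a small state machine: playback starts only once the local buffer holds enough data for continuous playback, and stalls to rebuffer if it drains. The byte counts and threshold here are illustrative assumptions, not values from the text.

```c
/* Illustrative prebuffering state for a streaming client. */
typedef struct {
    unsigned buffered;      /* bytes currently in the local buffer      */
    unsigned start_thresh;  /* bytes required before playback may start */
    int      playing;       /* 1 once continuous playback has begun     */
} StreamBuffer;

/* Data arriving from the network accumulates in the local buffer;
 * playback starts only after the start threshold is reached. */
static void on_data_received(StreamBuffer *sb, unsigned nbytes)
{
    sb->buffered += nbytes;
    if (!sb->playing && sb->buffered >= sb->start_thresh)
        sb->playing = 1;
}

/* The decoder consumes data; if the buffer empties, playback stalls
 * and must accumulate up to the threshold again before resuming. */
static void on_data_consumed(StreamBuffer *sb, unsigned nbytes)
{
    sb->buffered = (nbytes >= sb->buffered) ? 0 : sb->buffered - nbytes;
    if (sb->buffered == 0)
        sb->playing = 0;   /* stall: wait for the buffer to refill */
}
```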
Because the current API provides no interface for the H.264 or MPEG-4 formats, the corresponding decoder must be ported to the BREW platform. Porting mainly means replacing the decoder's C-library calls with BREW interfaces and replacing floating-point operations with integer or fixed-point arithmetic. In particular, for H.264 and Xvid the stack space used by the source code exceeds the limits of BREW handsets, so large arrays are changed to dynamically allocated memory and global arrays are moved into function scope; the decoded video frame is then converted to a bitmap and displayed on the phone screen. An important problem for video stream playback on mobile phones is audio/video synchronization. The data packets sent by the server carry the playback times of the audio and video, which allows the video to be adjusted against the audio playback time. Let HA be the playback time of the current audio and HV the playback time of the current video frame. If HV
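The comparison of HA and HV that the text begins to describe is, in the common synchronization scheme, a three-way decision: if the video frame's time lags the audio clock, frames are dropped to catch up; if it runs ahead, display is delayed; otherwise the frame is shown immediately. This sketch states that rule under assumed names; the tolerance value is illustrative, not from the text.

```c
typedef enum { SYNC_DROP_FRAME, SYNC_SHOW_FRAME, SYNC_DELAY_FRAME } SyncAction;

/* Decide what to do with the current video frame given the audio clock.
 * ha, hv: playback times in milliseconds of the current audio position
 * and the current video frame; tol_ms is the allowed drift (assumed). */
static SyncAction av_sync_decide(long ha, long hv, long tol_ms)
{
    if (hv < ha - tol_ms)
        return SYNC_DROP_FRAME;    /* video is late: skip to catch up   */
    if (hv > ha + tol_ms)
        return SYNC_DELAY_FRAME;   /* video is early: wait before show  */
    return SYNC_SHOW_FRAME;        /* within tolerance: display now     */
}
```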