Windows Media Foundation Learning Note 3 -- Media Playback


Skipping the book's second chapter, which introduces the TopoEdit tool, and going straight to the chapter on media playback.

This chapter on media playback walks through a file-playback example. It reminds me of when I first learned DirectShow: the very first example there was also file playback, but that one was relatively simple and didn't need much code. This demo, on the other hand, is bewildering at first sight; so much code just for simple file playback.


The first thing this chapter points out is that MF uses COM technology heavily, but not every MF object is a true COM object; that is, MF mixes COM objects with ordinary C++ objects. Since COM is used, COM obviously has to be initialized.


The following introduces the MF-related concepts involved in the process of playing a file:

1. Initialize COM and initialize MF. (Special note: the MF API comes only in a Unicode version; there is no ANSI version.)
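As a reference, here is a minimal sketch of that initialization step, assuming the usual MF headers and import libraries (mfplat.lib, ole32.lib); the function name InitializePlatform is just illustrative, not the sample's actual code:

    #include <windows.h>
    #include <mfapi.h>
    // Link with mfplat.lib and ole32.lib.

    HRESULT InitializePlatform()
    {
        // Initialize COM on this thread (the MF samples use the apartment-threaded model).
        HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED);
        if (SUCCEEDED(hr))
        {
            // Initialize the Media Foundation platform; pair with MFShutdown() on exit.
            hr = MFStartup(MF_VERSION);
        }
        return hr;
    }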


2. An important concept is introduced: the media session, which is roughly the equivalent of DirectShow's graph builder. The media session is mainly used to control the media pipeline. In this file-playback example, the media session internally pulls samples from the source (fetches data), hands them to MFTs (transform components, here the decoders), and then renders them; on the outside it exposes a set of functions for controlling playback and pause and for passing MF's internal events out to the application. This is very similar to DirectShow's filter graph.

a) Create the media session: MFCreateMediaSession(NULL, &m_pSession); The first parameter (optional configuration attributes) is left at its default NULL; the second parameter receives a pointer to the newly created media session.

b) Set the asynchronous event callback for the media session: m_pSession->BeginGetEvent((IMFAsyncCallback*)this, NULL); The first parameter is the callback object, which must inherit from IMFAsyncCallback; asynchronous events are delivered through the IMFAsyncCallback interface's Invoke function. Invoke is a pure virtual function of IMFAsyncCallback and must be implemented by the inheriting class. (A sketch of such a callback class follows below.)
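The following is a minimal sketch of a player class that implements IMFAsyncCallback, creates the session, and re-arms the event callback from Invoke. The class name CPlayer and the CreateSession method are illustrative, not the exact code from the book's sample; reference counting and error handling are kept to a minimum:

    #include <mfapi.h>
    #include <mfidl.h>
    #include <shlwapi.h>   // QISearch; link with shlwapi.lib, mfplat.lib, mf.lib

    class CPlayer : public IMFAsyncCallback
    {
    public:
        // IUnknown
        STDMETHODIMP QueryInterface(REFIID riid, void** ppv)
        {
            static const QITAB qit[] = { QITABENT(CPlayer, IMFAsyncCallback), { 0 } };
            return QISearch(this, qit, riid, ppv);
        }
        STDMETHODIMP_(ULONG) AddRef()  { return InterlockedIncrement(&m_cRef); }
        STDMETHODIMP_(ULONG) Release()
        {
            ULONG c = InterlockedDecrement(&m_cRef);
            if (c == 0) delete this;
            return c;
        }

        // IMFAsyncCallback
        STDMETHODIMP GetParameters(DWORD*, DWORD*) { return E_NOTIMPL; }  // optional
        STDMETHODIMP Invoke(IMFAsyncResult* pResult)
        {
            IMFMediaEvent* pEvent = NULL;
            // Fetch the event that triggered this callback, then request the next one.
            HRESULT hr = m_pSession->EndGetEvent(pResult, &pEvent);
            if (SUCCEEDED(hr))
            {
                // ... dispatch on the event type (MESessionTopologyStatus, MESessionEnded, ...) ...
                hr = m_pSession->BeginGetEvent(this, NULL);
            }
            if (pEvent) pEvent->Release();
            return hr;
        }

        HRESULT CreateSession()
        {
            HRESULT hr = MFCreateMediaSession(NULL, &m_pSession);
            if (SUCCEEDED(hr))
                hr = m_pSession->BeginGetEvent(this, NULL);   // start listening for events
            return hr;
        }

    private:
        long             m_cRef     = 1;
        IMFMediaSession* m_pSession = NULL;
    };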


3. The concept of a topology is introduced. In networking terms a topology describes structure; in this demo it means building a file-playback topology. The so-called topology describes which components are needed, so that the final media pipeline can be built from it. The topology build process is analyzed below:

a) Create the MF source: MF has a component called the source resolver, which works out the appropriate container and data format from the file URL (it can also be a network stream) and creates the corresponding source component.

i. Create the source resolver: MFCreateSourceResolver(&pSourceResolver);

ii. Create the MF source:
        // Use the synchronous source resolver to create the media source.
        hr = pSourceResolver->CreateObjectFromURL(
                 sURL,                        // URL of the source.
                 MF_RESOLUTION_MEDIASOURCE |
                 MF_RESOLUTION_CONTENT_DOES_NOT_HAVE_TO_MATCH_EXTENSION_OR_MIME_TYPE,
                                              // Indicate that we want a source object, and
                                              // pass in optional source search parameters.
                 NULL,                        // Optional property store for extra parameters.
                 &objectType,                 // Receives the created object type.
                 &pSource                     // Receives a pointer to the media source.
                 );

The English comments are quite detailed. The point to note is the second parameter, which indicates how to create and find a matching MF source. With the MF_RESOLUTION_MEDIASOURCE flag set, a media source is created; MF_RESOLUTION_CONTENT_DOES_NOT_HAVE_TO_MATCH_EXTENSION_OR_MIME_TYPE indicates that when matching a media source by file extension and MIME type fails, other registered media sources will be tried as well. With that, we have a media source.

(It is also worth noting that the method above builds the MF source synchronously; there is also an asynchronous way, intended mainly for network streaming media or other data sources that take some time to access.)
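Put together, the whole source-creation step looks roughly like the sketch below. The helper name CreateMediaSourceFromURL is illustrative (the actual sample is structured differently), and the QueryInterface at the end reflects the fact that the resolver hands back an IUnknown pointer:

    #include <windows.h>
    #include <mfapi.h>
    #include <mfidl.h>
    // Link with mfplat.lib and mf.lib.

    HRESULT CreateMediaSourceFromURL(PCWSTR szURL, IMFMediaSource** ppSource)
    {
        IMFSourceResolver* pResolver = NULL;
        IUnknown* pSourceUnk = NULL;
        MF_OBJECT_TYPE objectType = MF_OBJECT_INVALID;

        HRESULT hr = MFCreateSourceResolver(&pResolver);
        if (SUCCEEDED(hr))
        {
            // Synchronous resolution: blocks until the container/format is identified.
            hr = pResolver->CreateObjectFromURL(
                     szURL,
                     MF_RESOLUTION_MEDIASOURCE |
                     MF_RESOLUTION_CONTENT_DOES_NOT_HAVE_TO_MATCH_EXTENSION_OR_MIME_TYPE,
                     NULL,            // No extra properties.
                     &objectType,     // Should come back as MF_OBJECT_MEDIASOURCE.
                     &pSourceUnk);
        }
        if (SUCCEEDED(hr))
        {
            // The resolver returns IUnknown; ask for the media source interface.
            hr = pSourceUnk->QueryInterface(IID_PPV_ARGS(ppSource));
        }
        if (pSourceUnk) pSourceUnk->Release();
        if (pResolver) pResolver->Release();
        return hr;
    }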


b) Build the topology: first of all, the topology in this example is partial; it only creates the source and sink nodes and does not add any transforms. Simplified, the topology build in this example goes through the following steps:

i. Get the stream descriptors from the MF source. They describe the media stream information contained in the file (source), such as the number of streams and the related properties of each stream, e.g. whether it is audio or video.


ii. Create a source node for each stream. Nodes are roughly equivalent to DirectShow's pins: they stand in for the components. Then, for each stream type (audio or video), create the corresponding sink node; the sink is what does the rendering.


iii. Connect each source node to its sink node. There is a problem here: the data coming out of the source cannot necessarily be consumed by the sink directly. For example, encoded data cannot be rendered as-is; MFTs should sit in between. That is exactly why this topology is partial, not complete. Take a look at the code in the example; the comments say it plainly:

            // Connect the source node to the sink node. The resolver will find the
            // intermediate nodes needed to convert media types.
            hr = pSourceNode->ConnectOutput(0, pOutputNode, 0);

It also needs to be noted that connecting nodes does not connect the real objects (components); a node connection merely records that the components have this relationship. Actually connecting the components (forming the real pipeline) still relies on the media session; the node connection is just a hint! (A sketch of the whole partial-topology build follows below.)
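For reference, here is a minimal sketch of the partial-topology build described in steps i-iii, assuming hypothetical variable and function names and omitting most error handling. It creates one source node and one output node per selected stream and connects them; the audio renderer (SAR) or video renderer (EVR) is chosen from the stream's major type:

    #include <windows.h>
    #include <mfapi.h>
    #include <mfidl.h>
    #include <evr.h>    // MFCreateVideoRendererActivate; link with mf.lib, mfplat.lib

    // Hypothetical helper; error handling is largely omitted for brevity.
    HRESULT BuildPartialTopology(IMFMediaSource* pSource, HWND hVideoWnd,
                                 IMFTopology** ppTopology)
    {
        IMFTopology* pTopology = NULL;
        IMFPresentationDescriptor* pPD = NULL;

        HRESULT hr = MFCreateTopology(&pTopology);
        if (SUCCEEDED(hr))
            hr = pSource->CreatePresentationDescriptor(&pPD);

        DWORD cStreams = 0;
        if (SUCCEEDED(hr))
            hr = pPD->GetStreamDescriptorCount(&cStreams);

        for (DWORD i = 0; SUCCEEDED(hr) && i < cStreams; i++)
        {
            BOOL fSelected = FALSE;
            IMFStreamDescriptor* pSD = NULL;
            hr = pPD->GetStreamDescriptorByIndex(i, &fSelected, &pSD);
            if (SUCCEEDED(hr) && fSelected)
            {
                // Source node: represents one stream of the media source.
                IMFTopologyNode* pSourceNode = NULL;
                MFCreateTopologyNode(MF_TOPOLOGY_SOURCESTREAM_NODE, &pSourceNode);
                pSourceNode->SetUnknown(MF_TOPONODE_SOURCE, pSource);
                pSourceNode->SetUnknown(MF_TOPONODE_PRESENTATION_DESCRIPTOR, pPD);
                pSourceNode->SetUnknown(MF_TOPONODE_STREAM_DESCRIPTOR, pSD);
                pTopology->AddNode(pSourceNode);

                // Output (sink) node: audio renderer for audio, video renderer for video.
                GUID majorType = GUID_NULL;
                IMFMediaTypeHandler* pHandler = NULL;
                pSD->GetMediaTypeHandler(&pHandler);
                pHandler->GetMajorType(&majorType);

                IMFActivate* pActivate = NULL;
                if (majorType == MFMediaType_Audio)
                    MFCreateAudioRendererActivate(&pActivate);
                else
                    MFCreateVideoRendererActivate(hVideoWnd, &pActivate);

                IMFTopologyNode* pOutputNode = NULL;
                MFCreateTopologyNode(MF_TOPOLOGY_OUTPUT_NODE, &pOutputNode);
                pOutputNode->SetObject(pActivate);
                pTopology->AddNode(pOutputNode);

                // Connect them; the session's topology loader will insert any
                // decoders (MFTs) needed when the topology is resolved.
                pSourceNode->ConnectOutput(0, pOutputNode, 0);

                pOutputNode->Release();
                pActivate->Release();
                pHandler->Release();
                pSourceNode->Release();
            }
            if (pSD) pSD->Release();
        }

        if (SUCCEEDED(hr))
            *ppTopology = pTopology;          // Caller takes ownership of the reference.
        else if (pTopology)
            pTopology->Release();
        if (pPD) pPD->Release();
        return hr;
    }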


c) Resolve the above partial topology into a true media pipeline: this work is done by the media session. The description above actually skipped one step, which is binding the topology to the media session by calling a single function, SetTopology (see MSDN for its description). Resolving the partial topology really means activating the components described by the topology and inserting the components needed to convert the media types. Not sure whether that is clear; it comes down to two steps:

i. Activate the components described by the nodes in the topology. Step b) created the sink nodes but did not yet instantiate the corresponding audio renderer/video renderer; to conserve resources they are only instantiated, and thus made usable, when they are really needed (which is right here). (That is how the book describes it; in practice you only really see it by reading the source code.)

ii. Insert the appropriate MFTs so that the source's media type can be converted into a media type the sink (renderer) accepts. In other words, the media session performs the media data type conversion and completes the negotiation for the whole pipeline, so that playback can actually begin. This is what the media session does for us; it can be said to turn the partial topology into a real media pipeline. (A short sketch of binding and starting the topology follows below.)
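As a rough sketch, binding the partial topology to the session and starting playback once the session reports the resolved topology as ready looks something like this (variable names are illustrative; the MESessionTopologyStatus handling would live inside the Invoke callback shown earlier):

    // Hand the partial topology to the session; the session's topology loader
    // resolves it (activates the sinks, inserts decoders) asynchronously.
    hr = m_pSession->SetTopology(0, pTopology);

    // Later, when Invoke receives MESessionTopologyStatus with MF_TOPOSTATUS_READY,
    // the full pipeline exists and playback can be started:
    PROPVARIANT varStart;
    PropVariantInit(&varStart);            // VT_EMPTY = start from the current position.
    hr = m_pSession->Start(&GUID_NULL, &varStart);
    PropVariantClear(&varStart);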


OK, that's all. This article does not go into the APIs themselves; you can refer to MSDN, which is very detailed. This is just a record of the process.
