Simplify Linux multimedia development with GStreamer


I. Basic Concepts

GStreamer is the framework recommended for building streaming-media applications for the GNOME desktop environment.


The framework is built around plug-ins and pipelines: every functional module is implemented as a pluggable component (element) that can be dropped into any pipeline as needed. Because all plug-ins exchange data through the same pipeline mechanism, it is easy to "assemble" a full-featured multimedia application out of existing plug-ins.
1.1 Element handling

The element (GstElement) is a concept that every programmer using the GStreamer framework must understand: it is the basic building block of a pipeline and the basis of every component available in the framework, so it is no surprise that most functions in GStreamer operate on GstElement objects. From GStreamer's point of view, an element can be described as a black box with specific properties: it interacts with the outside world through its link points (pads) and exposes its features or functions to the rest of the framework.

According to their function, GStreamer further subdivides elements into the following categories:

• Source element. A data source element has only an output side. It can only produce data for the pipeline to consume and cannot process data. A typical example is an audio capture element, which reads raw audio data and then provides it to other modules as a data source.
• Filter element. A filter element has both an input side and an output side. It takes data from its input, processes it, and passes the result to its output. A typical example is an audio encoder element, which takes audio data from outside, encodes it with a particular compression algorithm, and then hands the encoded result to other modules.
• Sink element. A sink element has only an input side. It can only consume data and is the terminal point of a media pipeline. A typical example is an audio playback element, which writes the data it receives to the sound card; this is usually the last step of audio processing.
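As a concrete (and hedged) illustration, the sketch below creates one element of each category with gst_element_factory_make(), which is described in the next subsection; the factory names filesrc, mad, and osssink are the ones used later in this article and are assumed to be installed:

GstElement *source, *filter, *sink;

/* a source element that reads data from a disk file */
source = gst_element_factory_make ("filesrc", "disk_source");
/* a filter element that decodes MP3 data */
filter = gst_element_factory_make ("mad", "decoder");
/* a sink element that writes audio data to the sound card */
sink = gst_element_factory_make ("osssink", "play_audio");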
Figure 1 shows the differences among source elements, filter elements, and sink elements, as well as how they work together to form a pipeline:


Figure 1

Note that the concrete form of a filter element is very flexible: GStreamer does not strictly limit the number of inputs and outputs, and in fact an element may have one or more of each. Figure 2 shows the basic structure of an AVI splitter, which separates the incoming data into independent audio and video streams; the filter element that implements this function clearly has only one input but needs two outputs.


Figure 2

The only way to create a GstElement object in an application is through a factory object, GstElementFactory. Because the GStreamer framework provides many element types, it also provides many kinds of GstElementFactory objects, distinguished by their factory names. For example, the following code uses gst_element_factory_find() to obtain the factory object named "mad", which can then be used to create the corresponding MP3 decoder element:

GstElementFactory *factory;
factory = gst_element_factory_find ("mad");

Once the factory object has been obtained, a concrete GstElement object can be created with gst_element_factory_create(). This function takes two parameters: the factory object to use and the name of the element to be created. The element name can later be retrieved by a query, or a NULL pointer can be passed in to let the factory generate a default name. The following code shows how to use the factory object obtained above to create an MP3 decoder element named "decoder":

GstElement *element;
element = gst_element_factory_create (factory, "decoder");

When a created GstElement is no longer needed, gst_element_unref() must be called to release the memory resources it occupies:

gst_element_unref (element);

GStreamer uses the same mechanism as GObject to manage properties, including querying, setting, and getting them. Every GstElement object inherits at least the basic name property from its parent class, because functions such as gst_element_factory_make() and gst_element_factory_create() always use the name property when creating factory and element objects. The name property of a GstElement object can be set and read by calling gst_object_set_name() and gst_object_get_name().
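For example, the following minimal sketch (assuming the "mad" plug-in used elsewhere in this article is available) renames a decoder element and reads the name back; the GST_OBJECT casts are needed because the name property lives in the GstObject base class:

GstElement *element;
element = gst_element_factory_make ("mad", "decoder");
/* give the element a new name */
gst_object_set_name (GST_OBJECT (element), "my_decoder");
/* read the name back and print it */
g_print ("element name is: %s\n", gst_object_get_name (GST_OBJECT (element)));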

1.2 Pad handling

The pad (GstPad) is another basic concept introduced by the GStreamer framework: it is the connection point between an element and the outside world. An element exposes the media types it can handle to other elements through its pads. Once a GstElement object has been created successfully, gst_element_get_pad() can be used to obtain a particular pad of that element. For example, the following code returns the pad named "src" of the element:

GstPad *srcpad;
srcpad = gst_element_get_pad (element, "src");

If necessary, gst_element_get_pad_list() can be used to query all the pads of a given element. For example, the following code prints the names of all pads of the element:

GList *pads;
pads = gst_element_get_pad_list (element);
while (pads) {
  GstPad *pad = GST_PAD (pads->data);
  g_print ("pad name is: %s\n", gst_pad_get_name (pad));
  pads = g_list_next (pads);
}

As with elements, a pad's name can be set or read dynamically by calling gst_pad_set_name() and gst_pad_get_name(). The pads of an element fall into two categories: input (sink) pads and output (source) pads. An input pad can only receive data and cannot produce it, while an output pad is the opposite: it can only produce data and cannot receive it. The direction of a given pad can be obtained with gst_pad_get_direction(). Every pad in the GStreamer framework is attached to some element, and gst_pad_get_parent() returns the element a pad belongs to; its return value is a pointer to a GstElement.
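Putting these calls together, the following sketch (following the 0.8-era API used throughout this article, with error handling omitted) queries the direction of a pad and the element that owns it; GST_PAD_SRC and GST_PAD_SINK are the two useful values of the direction enumeration:

GstPad *pad;
GstElement *parent;

pad = gst_element_get_pad (element, "src");
/* query whether this is an output (source) or input (sink) pad */
if (gst_pad_get_direction (pad) == GST_PAD_SRC)
  g_print ("pad %s is an output pad\n", gst_pad_get_name (pad));
/* obtain the element that the pad is attached to */
parent = gst_pad_get_parent (pad);
g_print ("pad belongs to element: %s\n", gst_element_get_name (parent));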
To some extent, a pad can be seen as the spokesperson of its element, because it is responsible for describing the element's capabilities to the outside world. The GStreamer framework provides a unified mechanism for pads to describe these capabilities, implemented by the _GstCaps data structure:

struct _GstCaps {
  gchar *name;              /* the name of this caps */
  guint16 id;               /* type id (major type) */
  guint refcount;           /* caps are refcounted */
  GstProps *properties;     /* properties for this capability */
  GstCaps *next;            /* caps can be chained together */
};

Below is a description of the capabilities of the mad element. It is easy to see that the element has a sink pad and a src pad, each carrying its own capability information. The pad named sink is the input of the mad element: it accepts media data of MIME type audio/mp3 and has three properties, layer, bitrate, and framed. The pad named src is the output of the mad element: it produces media data of MIME type audio/raw and has several properties, including format, depth, rate, and channels.

Pads:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      'mad_sink':
        MIME type: 'audio/mp3':
  SRC template: 'src'
    Availability: Always
    Capabilities:
      'mad_src':
        MIME type: 'audio/raw':
        format: String: 'int'
        endianness: Integer: 1234
        width: Integer: 16
        depth: Integer: 16
        channels: Integer range: 1 - 2
        law: Integer: 0
        signed: Boolean: TRUE
        rate: Integer range: 11025 - 48000

To be precise, each pad in the GStreamer framework may carry several capability descriptions, which can be obtained with gst_pad_get_caps(). For example, the following code prints the pad's name together with the name and MIME type of each of its capabilities:

 
 
GstCaps *caps;
caps = gst_pad_get_caps (pad);
g_print ("pad name is: %s\n", gst_pad_get_name (pad));
while (caps) {
  g_print ("  capability name is %s, MIME type is %s\n",
           gst_caps_get_name (caps), gst_caps_get_mime (caps));
  caps = caps->next;
}

1.3 Bins

A bin is a container element in the GStreamer framework. It is normally used to hold other element objects, but since a bin is itself a GstElement object, it can also hold other bins. Bins make it possible to combine several elements into a single logical element; because you no longer need to deal with each contained element individually, they make it easy to build more complex pipelines. Another advantage of using bins in GStreamer is that the framework will try to optimize the data flow inside them, which is very attractive for multimedia applications.

Figure 3 shows the typical structure of a bin in the GStreamer framework:


Figure 3

GStreamer applications mainly use two kinds of bins:

• GstPipeline. A pipeline is the most commonly used container; the top-level bin of a GStreamer application must be a pipeline.
• GstThread. A thread bin provides synchronized processing; this kind of bin is generally used when a GStreamer application requires strict audio/video synchronization.

The GStreamer framework offers two ways to create a bin: through a factory or through a dedicated function. The following code shows how to create a thread object with the factory method and how to create a pipeline object with a dedicated function:

GstElement *thread, *pipeline;
/* create a thread object, letting the framework assign it a unique name */
thread = gst_element_factory_make ("thread", NULL);
/* create a pipeline object with the given name */
pipeline = gst_pipeline_new ("pipeline_name");

Once the bin has been created successfully, existing elements can be added to it with gst_bin_add():

GstElement *element;
GstElement *bin;
bin = gst_bin_new ("bin_name");
element = gst_element_factory_make ("mpg123", "decoder");
gst_bin_add (GST_BIN (bin), element);

Finding a specific element inside a bin is also easy and can be done with gst_bin_get_by_name():

GstElement *element;
element = gst_bin_get_by_name (GST_BIN (bin), "decoder");

Because a bin can itself be added to another bin in the GStreamer framework, bins may be nested; gst_bin_get_by_name() searches nested bins recursively when looking for an element. After an element has been added to a bin, it can be removed again when needed by calling gst_bin_remove():

GstElement *element;
gst_bin_remove (GST_BIN (bin), element);

If you look carefully at the bin shown in Figure 3, you will notice that it has no input or output pads of its own, so it clearly cannot interact with other elements as a logical whole. To solve this problem, GStreamer introduces the concept of the ghost pad: a ghost pad is chosen from among the pads of the elements inside the bin, and usually one input pad and one output pad are selected, as shown in Figure 4:


Figure 4

A bin with ghost pads behaves exactly like an ordinary element: it has the same properties, and every operation that applies to an element also applies to the bin, so such bins can be used in a GStreamer application just like elements. The following code shows how to add a ghost pad to a bin:

 
 
GstElement *bin;
GstElement *element;
element = gst_element_factory_make ("mad", "decoder");
bin = gst_bin_new ("bin_name");
gst_bin_add (GST_BIN (bin), element);
gst_element_add_ghost_pad (bin, gst_element_get_pad (element, "sink"), "sink");
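Once the bin exposes a ghost pad named "sink", it can be linked just like an ordinary element (element linking itself is covered in the next section). A minimal sketch, assuming a filesrc element and a hypothetical file name test.mp3:

GstElement *filesrc;
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_object_set (G_OBJECT (filesrc), "location", "test.mp3", NULL);
/* the ghost pad lets the whole bin be linked as if it were a single element */
gst_element_link (filesrc, bin);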

II. Linking Elements

With the concepts of element and pad in place, the way GStreamer processes multimedia data becomes very clear: the pads of different elements are linked one after another to form a media processing pipeline, so that the data is processed by each element as it flows through the pipeline, ultimately implementing a specific multimedia function.

Figure 1 showed a simple pipeline made of three basic elements. The source element is only responsible for producing data, and its output pad is linked to the input pad of the filter element; the filter element takes data from its own input pad, processes it, and sends the result through its output pad to the sink element linked to it; the sink element is only responsible for receiving data, and its input pad is linked to the output pad of the filter element, from which it receives the final result.

Elements in the GStreamer framework are linked through their pads. The following code shows how to link two elements through their pads and how to unlink them when needed:

 
 
GstPad *srcpad, *sinkpad;
srcpad = gst_element_get_pad (element1, "src");
sinkpad = gst_element_get_pad (element2, "sink");
/* link */
gst_pad_link (srcpad, sinkpad);
/* unlink */
gst_pad_unlink (srcpad, sinkpad);

If the two elements need to be connected through only one input pad and one output pad, it is simpler to call gst_element_link() to link them directly, or gst_element_unlink() to unlink them:

/* link */
gst_element_link (element1, element2);
/* unlink */
gst_element_unlink (element1, element2);

III. Element States

Once the elements in the GStreamer framework have been linked into a pipeline, they begin their own processing, and during that process they usually switch states several times. Each element is always in one of the following four states:

• NULL: the default state of every element, meaning it has just been created and has not started doing anything.
• READY: the element is ready and can start processing at any time.
• PAUSED: the element has temporarily stopped processing data for some reason.
• PLAYING: the element is processing data.

Every element starts in the NULL state and then moves through the NULL, READY, PAUSED, and PLAYING states in turn. The current state of an element can be changed by calling gst_element_set_state():

GstElement *bin;
/* create elements and link them into the bin ... */
gst_element_set_state (bin, GST_STATE_PLAYING);

When a pipeline has just been created, the pipeline and all the elements in it are in the NULL state. When you are done with a pipeline, do not forget to switch it back to the NULL state so that every element in it has a chance to release the resources it holds.

The real processing of a pipeline begins the first time it is switched to the READY state: at this point the pipeline and all of its elements initialize themselves in preparation for the data processing that is about to run. For a typical element, the work done in the READY state includes opening media files and audio devices, or establishing a connection to a remote media server.

Once a pipeline in the READY state is switched to the PLAYING state, the multimedia data to be processed flows through the whole pipeline and is handled in turn by the elements it contains, implementing the multimedia function the pipeline was built for. The GStreamer framework also allows a pipeline to be switched directly from the NULL state to the PLAYING state, without passing through the intermediate READY state.

A pipeline in the PLAYING state can be switched to the PAUSED state at any time, which temporarily stops all data flowing in the pipeline; it can be switched back to the PLAYING state when needed. If you want to insert or replace an element in the pipeline, you must first switch the pipeline to the PAUSED or NULL state. An element in the PAUSED state does not release the resources it holds.
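The state transitions described above map directly onto gst_element_set_state() calls on the top-level pipeline; a minimal sketch (the comment marks where elements could be inserted or replaced while the data flow is stopped):

/* temporarily stop the data flow in the pipeline */
gst_element_set_state (pipeline, GST_STATE_PAUSED);
/* ... insert or replace elements here while nothing is flowing ... */
/* resume playback */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* when finished, go back to NULL so every element can release its resources */
gst_element_set_state (pipeline, GST_STATE_NULL);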

IV. Implementing an MP3 Player

Having covered the basic concepts and processing flow, let us look at how the elements provided by the GStreamer framework can be used to implement a simple MP3 player. The structure shown in Figure 1 maps easily onto an MP3 player: the source element reads data from disk, the filter element decodes it, and the sink element writes the decoded data to the sound card.

Like many other GNOME projects, GStreamer is written in C. To use the functions GStreamer provides, a program must first call gst_init() to complete the initialization; this also passes the command-line arguments entered by the user on to the GStreamer library. A typical GStreamer application starts like this:

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* ... */
}

Next, three elements must be created and linked into a pipeline. Since all GStreamer elements share the same base type, they can be declared as follows:

GstElement *pipeline, *filesrc, *decoder, *audiosink;

In the GStreamer framework, the pipeline is used to hold and manage the elements. The following code creates a new pipeline named "pipeline":

/* create a new pipeline to hold the elements */
pipeline = gst_pipeline_new ("pipeline");

The source element reads data from a disk file. It has a location property that indicates the file's location on disk, and this property can be set through the standard GObject property mechanism:

/* create the source element */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);

The filter element is responsible for decoding the MP3 data; the simplest approach is to use the mad plug-in for the decoding work:

/* create the filter element */
decoder = gst_element_factory_make ("mad", "decoder");

The sink element plays the decoded data through the sound card:

/* create the sink element */
audiosink = gst_element_factory_make ("osssink", "play_audio");

The three elements that have been created must then be added to the pipeline and linked in order:

/* add the elements to the pipeline */
gst_bin_add_many (GST_BIN (pipeline), filesrc, decoder, audiosink, NULL);

/* link the elements through their pads */
gst_element_link_many (filesrc, decoder, audiosink, NULL);

Once everything is prepared, the pipeline's state can be switched to PLAYING to start the data processing of the whole pipeline:

/* start the pipeline */
gst_element_set_state (pipeline, GST_STATE_PLAYING);

Because no threads are used, gst_bin_iterate() must be called to determine when the pipeline has finished processing:

while (gst_bin_iterate (GST_BIN (pipeline)));

As long as new events are still being generated in the pipeline, gst_bin_iterate() keeps returning TRUE; only when the whole processing run has finished does it return FALSE, at which point the pipeline should be stopped and its resources released:

/* stop the pipeline */
gst_element_set_state (pipeline, GST_STATE_NULL);

/* release the resources */
gst_object_unref (GST_OBJECT (pipeline));

The complete source code of the MP3 player implemented with GStreamer is as follows:

 
 
#include <gst/gst.h>
#include <stdlib.h>

int main (int argc, char *argv[])
{
  GstElement *pipeline, *filesrc, *decoder, *audiosink;

  gst_init (&argc, &argv);

  if (argc != 2) {
    g_print ("usage: %s <mp3 filename>\n", argv[0]);
    exit (-1);
  }

  /* create a new pipeline to hold the elements */
  pipeline = gst_pipeline_new ("pipeline");

  /* create the element that reads data from disk */
  filesrc = gst_element_factory_make ("filesrc", "disk_source");
  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);

  /* create the decoder element */
  decoder = gst_element_factory_make ("mad", "decoder");

  /* create the audio playback element */
  audiosink = gst_element_factory_make ("osssink", "play_audio");

  /* add the elements to the pipeline */
  gst_bin_add_many (GST_BIN (pipeline), filesrc, decoder, audiosink, NULL);

  /* link the elements */
  gst_element_link_many (filesrc, decoder, audiosink, NULL);

  /* start playing */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  while (gst_bin_iterate (GST_BIN (pipeline)));

  /* stop the pipeline */
  gst_element_set_state (pipeline, GST_STATE_NULL);

  /* release the occupied resources */
  gst_object_unref (GST_OBJECT (pipeline));

  exit (0);
}
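A program like this is typically built against the GStreamer development package with pkg-config. A hedged sketch of the compile command, assuming the 0.8-era library described in this article and a source file named mp3player.c (the pkg-config module name changes with the installed GStreamer version):

gcc mp3player.c -o mp3player `pkg-config --cflags --libs gstreamer-0.8`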

V. Summary

As the GNOME desktop environment grows in popularity, GStreamer, a powerful framework for developing multimedia applications, is attracting more and more attention. With its flexible architecture and its many predefined media processing modules, GStreamer can greatly reduce the difficulty of developing multimedia applications on Linux.
