Simplest AV Playback Example 5: OpenGL plays RGB/YUV


=====================================================

The Simplest AV Playback Example series article list:

Simplest AV Playback Example 1: General Statement

Simplest AV Playback Example 2: GDI plays YUV/RGB

Simplest AV Playback Example 3: Direct3D plays YUV/RGB (via Surface)

Simplest AV Playback Example 4: Direct3D plays RGB (via Texture)

Simplest AV Playback Example 5: OpenGL plays RGB/YUV

Simplest AV Playback Example 6: OpenGL plays YUV420P (via Texture, using Shader)

Simplest AV Playback Example 7: SDL2 plays RGB/YUV

Simplest AV Playback Example 8: DirectSound plays PCM

Simplest AV Playback Example 9: SDL2 plays PCM

=====================================================

This document records the technology of playing video with OpenGL. OpenGL is a technology at the same level as Direct3D, and it has the advantage over Direct3D of being cross-platform. Although DirectX has become more influential than OpenGL and is used by most PC game developers, OpenGL remains irreplaceable in professional high-end graphics because of its color accuracy.

About OpenGL

Some introductory knowledge about OpenGL gathered from the Internet is listed here.
The Open Graphics Library (OpenGL) is a specification defining a cross-language, cross-platform application programming interface (API) used to generate two-dimensional and three-dimensional images.
The OpenGL specification is maintained by the OpenGL Architecture Review Board (ARB), which was established in 1992. The ARB consists of companies that are particularly interested in creating a unified, universally available API. According to the OpenGL official website, the June 2002 ARB voting members included 3Dlabs, Apple Computer, ATI Technologies, Dell Computer, Evans & Sutherland, Hewlett-Packard, IBM, Intel, Matrox, NVIDIA, SGI, and Sun Microsystems. Microsoft was one of the founding members but withdrew in March 2003.
OpenGL remains the only API that can challenge Microsoft's control of 3D graphics technology. It still has some vitality, but Silicon Graphics is no longer promoting OpenGL in any way that would make Microsoft unhappy, so it carries considerable risk. OpenGL occupies a dominant position in high-end graphics devices and professional applications (areas where Direct3D is currently not supported). The open source community (especially the Mesa project) has been working to provide OpenGL support.

OpenGL rendering Pipeline

The following was also found online. I understand part of it, but because my 3D fundamentals are not solid, some aspects are still not fully understood.

The OpenGL rendering pipeline processes graphics information in a specific order. The input can be divided into two parts: vertex information (coordinates, normal vectors, and so on) and pixel information (images, textures, etc.). The graphical information is eventually written into the frame buffer; the data stored in the frame buffer (an image) can be read back by the application (to save the result, or to use it as new input, as shown in the figure below).

Display list
A display list is a set of OpenGL commands that are stored (compiled) for later execution. All data, both geometry (vertex) data and pixel data, can be stored in a display list. Caching data and commands in a display list improves performance.

Vertex operation (vertex processing)
The vertex and normal coordinates are transformed from the object coordinate system to the eye coordinate system by the model-view matrix. If lighting is enabled, lighting calculations are performed on the transformed vertex and normal coordinates; the lighting calculation updates the color value of each vertex.

Primitive assembly

After vertex processing, the primitives (points, lines, polygons) are transformed by the projection matrix and then clipped by the view-volume clipping planes, converting them from the eye coordinate system to the clip coordinate system. After that, perspective division (dividing by w) and the viewport transform are performed to project the 3D scene into the window coordinate system.

Pixel transfer operation

After pixels are unpacked from client memory, they are scaled, biased, mapped, and clamped. These steps are the pixel transfer operations. The transferred data is either stored in texture memory or rasterized directly into fragments.

Texture memory

A texture image is loaded into texture memory and then applied to geometry objects.

Rasterization

Rasterization is the process of converting geometry data (vertex coordinates, etc.) and pixel data into fragments; each fragment corresponds to a pixel in the frame buffer and holds the color and opacity information of one point on the screen. A fragment is a rectangular array containing color, depth, line width, point size, and other information (anti-aliasing calculations, and so on). If the rendering mode is set to GL_FILL, the pixel information inside each polygon is filled in at this stage.



Take the triangle above as an example: the input is the triangle's three vertex coordinates and their colors. The vertex operation transforms the vertex coordinates and normal vectors; the color information does not need to be transformed, but the lighting calculation affects the color of each vertex. After rasterization, the triangle is broken up into points: no longer three coordinates, but a series of points, each storing its corresponding color, depth, and opacity information.

Fragment operation
This is the last stage of processing, converting fragments into pixels in the frame buffer. First comes texel generation: a texture element is generated from the data in texture memory and applied to each fragment. Fog is then calculated. After the fog calculation, several fragment tests are carried out in sequence: the scissor test, alpha test, stencil test, and depth test. Finally, blending, dithering, logical operations, and masking are performed, and the final pixel is written into the frame buffer.

The contrast between OpenGL and Direct3D
The technology of video display with Direct3D has been described in the earlier Direct3D articles of this series and is not repeated here. I read an article on the Internet about their differences that is simple and clear, so I quote it here:

A little comparison between OpenGL and Direct3D
Where OGL is better than D3D:
OGL is the industry standard, and D3D is unavailable on most non-Windows operating systems.
OGL's color rendition is better than D3D's, and surfaces look smoother.
OGL's functions are very regular, unlike D3D, which uses COM-style pointer calls and very long function names.
OGL uses a right-handed coordinate system, the convention used in mathematics. D3D can also be switched to a right-handed coordinate system, but that requires d3dx9_36.dll support.
Commonly used matrices in OGL, such as the world matrix, are already packaged; in D3D you have to write them yourself.
OGL is very flexible in drawing, while in D3D you must define an FVF in advance so that all the information is written into a stream before drawing. This is what leads to vertex buffers and index buffers. It seems Microsoft put too many buffers into D3D, which makes it hard to learn. Look at OGL: it needs none of this.
D3D has many versions; if the video card does not support one, it is a pile of wasted effort. OGL has not changed for years, so most graphics cards support it.
Also, I found that D3D's translucency has many problems, namely the ordering of two translucent objects: the front one is blocked by the back one.

But D3D also has advantages over OGL:
D3D supports many image file formats, while with OGL you have to write the JPG-loading code yourself.
Because D3D uses pointer calls, hooking D3D is difficult, which makes writing cheats harder.
D3D is a member of DirectX. Programmers can use DirectMusic for sound playback, and the components always match well, while OGL can only draw.
D3D is a library that Microsoft aggressively promotes. Microsoft, on the other hand, aggressively suppresses OGL. (A product Microsoft was itself involved in, and look how it is treated.)
Because of this, D3D has become mainstream in China's large game industry. (I think they are blindly following the trend; in fact, many foreign games use OGL.)

The process of OpenGL video display

Playing video with OpenGL in the simplest case requires the following steps:
1. Initialization
1) Initialize
2) Create a window
3) Set the drawing function
4) Set the timer
5) Enter the message loop
2. Loop to display frames
1) Adjust the display position and image size
2) Draw
3) Show

One point needs a little explanation here. Unlike Direct3D, OpenGL does not need a Win32 window program that uses WinMain() as the entry function. With Direct3D, you must write a Win32 window program and call CreateWindow() to create a window before you can draw in it. OpenGL only needs an ordinary console program (with main() as the entry function). Of course, OpenGL can also draw images in the window of a Win32 program, just like Direct3D.

The following is a detailed analysis of the above process, combined with the example code for playing YUV/RGB with OpenGL.
Before detailing the playback process, let me mention an obvious impression from learning OpenGL: OpenGL has a great many functions, but each function takes few parameters. Direct3D is just the opposite: fewer functions, but more arguments per function.
1. Initialization

1) Initialize
glutInit() is used to initialize the GLUT library. Its prototype is as follows:

void glutInit(int *argcp, char **argv);

It takes two parameters: argcp and argv. In general, simply pass the argc and argv from the main() function to it.
Here is a brief introduction to three auxiliary libraries in OpenGL: GLU, GLUT, and GLEW.
GLU is a utility library containing 43 functions, all prefixed with "glu". To reduce the burden of programming, GLU wraps OpenGL functions: by calling core library functions, it provides relatively simple usage for developers and implements some more complex operations.
GLUT is a toolkit library that is basically used for window interfaces and is cross-platform.

GLEW is a cross-platform extension-loading library. It is not required. It automatically identifies all the advanced OpenGL extension functions supported by the current platform. I have not studied it in depth.

glutInitDisplayMode() is used to set the initial display mode. Its prototype is as follows.

void glutInitDisplayMode(unsigned int mode);

Here mode can be one of the following values or a combination of them:
GLUT_RGB: Specifies an RGB color mode window
GLUT_RGBA: Specifies an RGBA color mode window
GLUT_INDEX: Specifies a color-index mode window
GLUT_SINGLE: Specifies a single-buffered window
GLUT_DOUBLE: Specifies a double-buffered window
GLUT_ACCUM: The window uses the accumulation buffer
GLUT_ALPHA: The window's color components include an alpha value
GLUT_DEPTH: The window uses a depth buffer
GLUT_STENCIL: The window uses a stencil buffer
GLUT_MULTISAMPLE: Specifies a window that supports multisampling
GLUT_STEREO: Specifies a stereo window
GLUT_LUMINANCE: The window uses a luminance color model

It is important to note that if you use double buffering (GLUT_DOUBLE), you need to show the drawing with glutSwapBuffers(). If you use single buffering (GLUT_SINGLE), you need to show it with glFlush().
When using OpenGL to play video, we can use the following code:

glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);

2) Create a window
glutInitWindowPosition() sets the position of the window; you can specify the x and y coordinates.
glutInitWindowSize() sets the size of the window; you can set the width and height.
glutCreateWindow() creates a window; you can specify its title.
The above functions are very basic and are not described in detail. A sample code snippet:

glutInitWindowPosition(100, 100);
glutInitWindowSize(500, 500);
glutCreateWindow("Simplest Video Play OpenGL");

3) Set the drawing function
glutDisplayFunc() is used to set the drawing function. The operating system calls this function to redraw the window whenever necessary, similar to handling the WM_PAINT message in Windows programming. For example, when a window is moved to the edge of the screen and then moved back, the function is called to redraw the window. Its prototype is as follows.

void glutDisplayFunc(void (*func)(void));

Here func specifies the redraw function. For example, during video playback, specify the display() function for redrawing:

glutDisplayFunc(&display);

4) Set the timer
When playing a video, a certain number of frames (usually 25) must be shown per second, so a timer is used to call the drawing function at fixed intervals. The prototype of the timer function glutTimerFunc() is as follows.

void glutTimerFunc(unsigned int millis, void (*func)(int value), int value);

Its parameters have the following meanings:
millis: the delay, in milliseconds (1 second = 1000 milliseconds).
func: the function the timer will call.
value: a parameter passed to the callback function; we do not use it here.
If you only call glutTimerFunc() once in the main function, you will see that the callback is invoked only once. Therefore it is necessary to call glutTimerFunc() again inside the callback, registering the callback itself. Only in this way will the callback be invoked repeatedly.
For example, during video playback, specify that the timeFunc() function is called every 40 milliseconds.
In the main function:

glutTimerFunc(40, timeFunc, 0);

Then set this in the timeFunc() function:

void timeFunc(int value){
    display();
    // Present frame every 40 ms
    glutTimerFunc(40, timeFunc, 0);
}

This makes display() get called once every 40 ms.

5) Enter the message loop
glutMainLoop() enters the GLUT event processing loop. Once called, this function never returns. During video playback, the video starts playing after this function is called.

2. Loop to display frames

1) Adjust the display position and image size
This step adjusts the size and position of the image. If you skip it and draw directly with glDrawPixels(), you will find that the image sits in the lower-left corner of the window and is upside down (of course, if the window and the image are the same size, the corner problem does not occur). The effect is shown in the following figure.

To solve these problems, the relevant functions must be called to transform the image. The transformation uses two functions: glRasterPos3f() and glPixelZoom().
glRasterPos3f() translates the image. Its prototype is as follows.

void glRasterPos3f(GLfloat x, GLfloat y, GLfloat z);

Here x specifies the x coordinate and y the y coordinate; z is not used here.
A note on OpenGL coordinates: the origin is at the center of the window, and the value at the edge of the window is 1.0, basically the same as the coordinate system in mathematics. The lower-left corner of the window is (-1,-1), and the upper-left corner is (-1,1).

For example, if we call glRasterPos3f(-1.0f, 0.0f, 0), the image is moved to (-1,0), as shown in the figure below.


glPixelZoom() enlarges, shrinks, and flips the image. Its prototype is as follows.

void glPixelZoom(GLfloat xfactor, GLfloat yfactor);

Here xfactor and yfactor specify the magnification factors along the x and y axes (a value less than 1 shrinks the image). Specifying a negative value flips the image. As mentioned above, displaying pixel data directly with OpenGL produces an upside-down image, so the image needs to be flipped along the y axis.
For example, if the pixel data has width pixel_w and height pixel_h, and the window size is screen_w by screen_h, the following code stretches the image to the window size and flips it:

glPixelZoom((float)screen_w/(float)pixel_w, -(float)screen_h/(float)pixel_h);

Combining the above two functions, that is, "translate + flip + stretch", yields a full-screen image, as shown in the figure below.

PS: This method is rather clumsy; there should be a better way, but I have not investigated further.


2) Drawing
Use glDrawPixels() to draw the pixel data stored at a given memory location. The function prototype is as follows.

void glDrawPixels(
    GLsizei width, GLsizei height,
    GLenum format,
    GLenum type,
    const GLvoid *pixels);

The meanings of the function's arguments are as follows:
width: the width of the pixel data.
height: the height of the pixel data.
format: the format of the pixel data, such as GL_RGB, GL_BGR, GL_BGRA.
type: the data type of the pixel data in memory, such as GL_UNSIGNED_BYTE.
pixels: a pointer to the memory storing the pixel data.
For example, to draw data in RGB24 format with width pixel_w and height pixel_h, where the pixel data is stored in buffer, you can use the following code.

glDrawPixels(pixel_w, pixel_h, GL_RGB, GL_UNSIGNED_BYTE, buffer);

3) Show
When using double buffering, call glutSwapBuffers() to show the result.
When using single buffering, call glFlush() to show the result.


Summary of the process of video display

The function call structure of video display can be summarized in the following figure.



Code
The source code is pasted below.

/**
 * Simplest OpenGL play video example (OpenGL play RGB/YUV)
 * Simplest Video Play OpenGL (OpenGL play RGB/YUV)
 *
 * Lei Xiaohua
 * leixiaohua1020@126.com
 * Communication University of China / Digital TV Technology
 * http://blog.csdn.net/leixiaohua1020
 *
 * This program uses OpenGL to play RGB/YUV video pixel data. It can
 * actually only play RGB (RGB24, BGR24, BGRA) data directly. If the
 * input data is YUV420P, it is converted to RGB data before playing.
 * This program is the simplest example of playing pixel data with
 * OpenGL, suitable for beginners of OpenGL.
 *
 * Function call steps are as follows:
 *
 * [Init]
 * glutInit():            Init the GLUT library.
 * glutInitDisplayMode(): Set the display mode.
 * glutCreateWindow():    Create a window.
 * glutDisplayFunc():     Set the drawing function (called on redraw).
 * glutTimerFunc():       Set a timer.
 * glutMainLoop():        Enter the message loop.
 *
 * [Loop to render data]
 * glRasterPos3f(), glPixelZoom(): Adjust the display position and image size.
 * glDrawPixels():        Draw.
 * glutSwapBuffers():     Show.
 */

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "glew.h"
#include "glut.h"

//Set '1' to choose a type of file to play
#define LOAD_RGB24   1
#define LOAD_BGR24   0
#define LOAD_BGRA    0
#define LOAD_YUV420P 0

int screen_w=500,screen_h=500;
const int pixel_w=320,pixel_h=180;

//Bit per Pixel
#if LOAD_BGRA
const int bpp=32;
#elif LOAD_RGB24|LOAD_BGR24
const int bpp=24;
#elif LOAD_YUV420P
const int bpp=12;
#endif

//YUV/RGB file
FILE *fp = NULL;

unsigned char buffer[pixel_w*pixel_h*bpp/8];
unsigned char buffer_convert[pixel_w*pixel_h*3];

inline unsigned char CONVERT_ADJUST(double tmp)
{
	return (unsigned char)((tmp >= 0 && tmp <= 255) ? tmp : (tmp < 0 ? 0 : 255));
}

//YUV420P to RGB24
void CONVERT_YUV420PtoRGB24(unsigned char* yuv_src, unsigned char* rgb_dst, int nWidth, int nHeight)
{
	unsigned char Y, U, V, R, G, B;
	unsigned char *y_planar, *u_planar, *v_planar;
	int rgb_width = nWidth * 3;
	int u_width = (nWidth >> 1);
	int ypSize = nWidth * nHeight;
	int upSize = (ypSize >> 2);
	int offSet = 0;

	y_planar = yuv_src;
	u_planar = yuv_src + ypSize;
	v_planar = u_planar + upSize;

	for (int i = 0; i < nHeight; i++) {
		for (int j = 0; j < nWidth; j++) {
			// Get the Y value from the Y plane
			Y = *(y_planar + nWidth * i + j);
			// Get the U and V values from the subsampled chroma planes
			offSet = (i >> 1) * u_width + (j >> 1);
			U = *(u_planar + offSet);
			V = *(v_planar + offSet);

			// Calculate the R,G,B values
			R = CONVERT_ADJUST(Y + 1.4075 * (V - 128));
			G = CONVERT_ADJUST(Y - 0.3455 * (U - 128) - 0.7169 * (V - 128));
			B = CONVERT_ADJUST(Y + 1.7790 * (U - 128));
			/* The following formulas are from Microsoft's MSDN:
			int C = Y - 16;
			int D = U - 128;
			int E = V - 128;
			R = CONVERT_ADJUST((298 * C + 409 * E + 128) >> 8);
			G = CONVERT_ADJUST((298 * C - 100 * D - 208 * E + 128) >> 8);
			B = CONVERT_ADJUST((298 * C + 516 * D + 128) >> 8);
			*/
			offSet = rgb_width * i + j * 3;
			rgb_dst[offSet]     = B;
			rgb_dst[offSet + 1] = G;
			rgb_dst[offSet + 2] = R;
		}
	}
}

void display(void)
{
	if (fread(buffer, 1, pixel_w*pixel_h*bpp/8, fp) != pixel_w*pixel_h*bpp/8) {
		// Loop back to the start of the file
		fseek(fp, 0, SEEK_SET);
		fread(buffer, 1, pixel_w*pixel_h*bpp/8, fp);
	}
	//Make the picture fill the window
	//Move to (-1.0,1.0)
	glRasterPos3f(-1.0f, 1.0f, 0);
	//Zoom, flip
	glPixelZoom((float)screen_w/(float)pixel_w, -(float)screen_h/(float)pixel_h);

#if LOAD_BGRA
	glDrawPixels(pixel_w, pixel_h, GL_BGRA, GL_UNSIGNED_BYTE, buffer);
#elif LOAD_RGB24
	glDrawPixels(pixel_w, pixel_h, GL_RGB, GL_UNSIGNED_BYTE, buffer);
#elif LOAD_BGR24
	glDrawPixels(pixel_w, pixel_h, GL_BGR_EXT, GL_UNSIGNED_BYTE, buffer);
#elif LOAD_YUV420P
	CONVERT_YUV420PtoRGB24(buffer, buffer_convert, pixel_w, pixel_h);
	glDrawPixels(pixel_w, pixel_h, GL_RGB, GL_UNSIGNED_BYTE, buffer_convert);
#endif

	//GLUT_DOUBLE
	glutSwapBuffers();
	//GLUT_SINGLE
	//glFlush();
}

void timeFunc(int value)
{
	display();
	// Present frame every 40 ms
	glutTimerFunc(40, timeFunc, 0);
}

int main(int argc, char* argv[])
{
#if LOAD_BGRA
	fp = fopen("../test_bgra_320x180.rgb", "rb+");
#elif LOAD_RGB24
	fp = fopen("../test_rgb24_320x180.rgb", "rb+");
#elif LOAD_BGR24
	fp = fopen("../test_bgr24_320x180.rgb", "rb+");
#elif LOAD_YUV420P
	fp = fopen("../test_yuv420p_320x180.yuv", "rb+");
#endif
	if (fp == NULL) {
		printf("Cannot open this file.\n");
		return -1;
	}

	//GLUT init
	glutInit(&argc, argv);
	//Double buffer, use glutSwapBuffers() to show
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
	//Single buffer, use glFlush() to show
	//glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
	glutInitWindowPosition(100, 100);
	glutInitWindowSize(screen_w, screen_h);
	glutCreateWindow("Simplest Video Play OpenGL");
	printf("Simplest Video Play OpenGL\n");
	printf("Lei Xiaohua\n");
	printf("http://blog.csdn.net/leixiaohua1020\n");
	printf("OpenGL Version: %s\n", glGetString(GL_VERSION));

	glutDisplayFunc(&display);
	//Start the timer
	glutTimerFunc(40, timeFunc, 0);
	glutMainLoop();
	return 0;
}



Code notes

1. You can choose which pixel data format to read (BGRA, RGB24, BGR24, YUV420P) by setting the macros defined at the start of the file.

//Set '1' to choose a type of file to play
#define LOAD_RGB24   1
#define LOAD_BGR24   0
#define LOAD_BGRA    0
#define LOAD_YUV420P 0

2. The width and height of the window are screen_w and screen_h. The width and height of the pixel data are pixel_w and pixel_h. They are defined as follows.

//Width, Height
int screen_w=500,screen_h=500;
const int pixel_w=320,pixel_h=180;

3. Note the differences in how formats are displayed
The three formats BGRA, BGR24, and RGB24 can be displayed directly by setting the pixel format in glDrawPixels(). YUV420P cannot be displayed directly; the example in this article first converts YUV420P to RGB24 and then displays it.

Run Results

No matter which file you choose to load, the result looks the same, as shown in the following illustration.



The download code is in "simplest Media Play"


SourceForge Project address: https://sourceforge.net/projects/simplestmediaplay/

CSDN Download Address: http://download.csdn.net/detail/leixiaohua1020/8054395

Note:

The project is updated regularly to fix minor issues; please check the general page of this series for the latest version:

"Simplest AV Playback Example 1: general statement"

The above project contains examples of multimedia playback using various APIs (Direct3D, OpenGL, GDI, DirectSound, SDL2). The audio examples take PCM sample data as input and play it through the system's sound card; the video examples take YUV/RGB pixel data as input and display it in a window on the screen.

Through the code of this project, beginners can quickly learn how to use these APIs to play video and audio.

It includes the following sub-projects:

simplest_audio_play_directsound: Plays PCM audio sample data using DirectSound.
simplest_audio_play_sdl2: Plays PCM audio sample data using SDL2.
simplest_video_play_direct3d: Plays RGB/YUV video pixel data using a Direct3D Surface.
simplest_video_play_direct3d_texture: Plays RGB video pixel data using a Direct3D Texture.
simplest_video_play_gdi: Plays RGB/YUV video pixel data using GDI.
simplest_video_play_opengl: Plays RGB/YUV video pixel data using OpenGL.
simplest_video_play_opengl_texture: Plays YUV video pixel data using an OpenGL Texture.
simplest_video_play_sdl2: Plays RGB/YUV video pixel data using SDL2.
