Android OpenGL ES 2.0: everything starts with drawing a texture


A texture, in OpenGL terms, can be understood as an image loaded into the graphics card's memory (video memory). Android devices have supported OpenGL ES 2.0 since Android 2.2; earlier versions offered only ES 1.0 and ES 1.1. Put simply, OpenGL ES is a trimmed-down version of OpenGL intended for embedded devices. ES 2.0 is not compatible with 1.x; for the differences and compatibility notes, see the official Android documentation.

First, Android provides a special view for OpenGL rendering, GLSurfaceView. Our view must extend GLSurfaceView. A simple example:

public class MyGLSurfaceView extends GLSurfaceView {

    public MyGLSurfaceView(Context context) {
        super(context);
        setFocusableInTouchMode(true);

        // Tell the surface view we want to create an OpenGL ES 2.0-compatible
        // context, and set an OpenGL ES 2.0-compatible renderer.
        this.setEGLContextClientVersion(2);

        this.setRenderer(new MyRenderer());
    }
}

There is nothing special here. GLSurfaceView delegates its rendering to a renderer interface, android.opengl.GLSurfaceView.Renderer, whose methods we need to implement.

public class MyRenderer implements Renderer {

    public void onDrawFrame(GL10 gl) {}

    public void onSurfaceChanged(GL10 gl, int width, int height) {}

    public void onSurfaceCreated(GL10 gl, EGLConfig config) {}
}

The interface has three methods, corresponding to drawing a frame, the drawing surface changing size, and the drawing surface being created. Note that the GL10 gl parameter is an OpenGL ES 1.x object; we will not use it here. Also note that onDrawFrame is called by the system and never needs to be invoked manually: by default the system keeps calling it back at a steady frame rate.
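By the way, if you do not need continuous redrawing, GLSurfaceView can also render on demand. A minimal sketch, assuming the MyGLSurfaceView constructor above (setRenderMode must be called after setRenderer):

// In MyGLSurfaceView's constructor, after setRenderer(...):
// draw only when requestRender() is called, instead of continuously
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

// Later, e.g. after some state changed, ask for one new frame:
// myGLSurfaceView.requestRender();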

 

Next we move on to ES 2.0 itself. The code first:

public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    GLES20.glEnable(GLES20.GL_TEXTURE_2D);

    // Activate texture unit 0
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);

    loadVertex();
    initShader();
    loadTexture();
}



When the drawing surface is created, we enable 2D texturing and activate texture unit 0. What does that mean? A full explanation would take a while, so for now just remember that OpenGL is state-based: most calls simply set or switch state. Enabling GL_TEXTURE_2D is one such state switch, telling OpenGL that 2D textures may be used. (Strictly speaking, glEnable(GL_TEXTURE_2D) belongs to the fixed-function ES 1.x pipeline; in ES 2.0 texturing is driven entirely by the shaders, so the call is harmless but not required.)

What about activating a texture unit? That is a hardware-related concept: the GPU divides its texture storage into several texture units, each of which can hold a bound texture, and here we use unit 0. With multiple units you can sample several textures in a single draw (see the sketch below). Next, we call three functions: load the vertices, initialize the shaders, and load the texture.
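Before moving on, here is a rough illustration of using more than one texture unit. This is a minimal sketch only; secondTextureId and u_samplerTexture2 are hypothetical names that do not appear in the code below:

// Bind a second texture to texture unit 1 and point a second sampler uniform at it.
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, secondTextureId);   // hypothetical texture id
int uniformTexture2 = GLES20.glGetUniformLocation(program, "u_samplerTexture2"); // hypothetical uniform
GLES20.glUniform1i(uniformTexture2, 1); // "1" means texture unit 1, not a texture id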


First, loading the vertices. OpenGL draws shapes by connecting vertices; why that is such a powerful design will become clearer later. A vertex is simply a coordinate point carrying position (and, here, texture-coordinate) information. The code is as follows:

private void loadVertex() {
    // float size = 4
    this.vertex = ByteBuffer.allocateDirect(quadVertex.length * 4)
                            .order(ByteOrder.nativeOrder())
                            .asFloatBuffer();
    this.vertex.put(quadVertex).position(0);

    // short size = 2
    this.index = ByteBuffer.allocateDirect(quadIndex.length * 2)
                           .order(ByteOrder.nativeOrder())
                           .asShortBuffer();
    this.index.put(quadIndex).position(0);
}

private FloatBuffer vertex;
private ShortBuffer index;

private float[] quadVertex = new float[] {
        -0.5f,  0.5f, 0.0f, // Position 0
         0.0f,  1.0f,       // TexCoord 0

        -0.5f, -0.5f, 0.0f, // Position 1
         0.0f,  0.0f,       // TexCoord 1

         0.5f, -0.5f, 0.0f, // Position 2
         1.0f,  0.0f,       // TexCoord 2

         0.5f,  0.5f, 0.0f, // Position 3
         1.0f,  1.0f,       // TexCoord 3
};

private short[] quadIndex = new short[] {
        (short) 0, // Position 0
        (short) 1, // Position 1
        (short) 2, // Position 2

        (short) 2, // Position 2
        (short) 3, // Position 3
        (short) 0, // Position 0
};

FloatBuffer and ShortBuffer here wrap native (direct) buffers: their data is not managed by the Java virtual machine but stored the way a C program would store it, which is what OpenGL expects. quadVertex holds the rectangle's positions together with its texture coordinates, which takes a moment to explain because two of OpenGL's coordinate systems are involved. In short, OpenGL coordinates are normalized floating-point values: vertex coordinates run from -1.0 to 1.0 with the center of the screen at (0, 0), while texture coordinates run from 0.0 to 1.0 with (0, 0) at the lower-left corner of the image. quadVertex places the image in a rectangle on the screen: Position 0 is the top-left vertex, followed by the bottom-left, bottom-right, and top-right corners, and the texture coordinates follow the same order.
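To make the vertex coordinate range concrete, here is a small helper of my own (not part of the original code) that converts an Android screen pixel position into the -1.0..1.0 coordinates used in quadVertex, assuming the viewport covers the whole view:

// Pixel coordinates: x grows right, y grows down, origin at the top-left corner.
// Normalized device coordinates: x grows right, y grows up, origin at the center, range -1.0..1.0.
static float[] pixelToNdc(float px, float py, int viewWidth, int viewHeight) {
    float ndcX = 2.0f * px / viewWidth - 1.0f;
    float ndcY = 1.0f - 2.0f * py / viewHeight;
    return new float[] { ndcX, ndcY };
}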

quadIndex describes the order in which those vertices are used. The rectangle has four vertices, and each vertex carries three position components plus two texture coordinates, i.e. five floats per vertex. Why are the indices arranged this way? OpenGL ES can only rasterize triangles, so the rectangle is built from two triangles sharing a diagonal: indices (0, 1, 2) form the first triangle and (2, 3, 0) the second, with the shared vertices simply reused. More on this another time.

So this code stores the rectangle's positions and texture coordinates in native buffers for later use.

 

Second, initializing the shaders. Shaders are the distinguishing feature of ES 2.0, also called the programmable pipeline, and are another point where it differs from ES 1.x. Briefly: a shader is a small program written in a C-like language (GLSL), and it comes in two kinds, vertex shaders and fragment shaders, which plug into different stages of the OpenGL rendering pipeline. As follows:

Vertex shader:

uniform mat4 u_MVPMatrix;

attribute vec4 a_position;
attribute vec2 a_texCoord;

varying vec2 v_texCoord;

void main()
{
    gl_Position = a_position;
    v_texCoord  = a_texCoord;
}

Fragment shader:

precision lowp float;

varying vec2 v_texCoord;

uniform sampler2D u_samplerTexture;

void main()
{
    gl_FragColor = texture2D(u_samplerTexture, v_texCoord);
}

Remember one thing here: the vertex shader runs once per vertex, and the fragment shader runs once per pixel (fragment). Our rectangle has four vertices, so the vertex shader above is applied to each of them. In other words, vertex shaders deal with position-related information, and fragment shaders deal with color and texture information.

Both shaders are plain text and must be compiled and linked before ES 2.0 can use them, much like compiling and linking a C program. OpenGL provides functions for all of this. As follows:

private void initShader() {
    String vertexSource   = Tools.readFromAssets("VertexShader.glsl");
    String fragmentSource = Tools.readFromAssets("FragmentShader.glsl");

    // Load the shaders and get a linked program
    program = GLHelper.loadProgram(vertexSource, fragmentSource);

    // Get the attribute locations
    attribPosition = GLES20.glGetAttribLocation(program, "a_position");
    attribTexCoord = GLES20.glGetAttribLocation(program, "a_texCoord");

    uniformTexture = GLES20.glGetUniformLocation(program, "u_samplerTexture");

    GLES20.glUseProgram(program);

    GLES20.glEnableVertexAttribArray(attribPosition);
    GLES20.glEnableVertexAttribArray(attribTexCoord);

    // Set the sampler to texture unit 0
    GLES20.glUniform1i(uniformTexture, 0);
}

You can see that the vertex and fragment shaders are combined into a program that OpenGL can use: a compiled program object that lives in video memory. What are GLES20.glGetAttribLocation and GLES20.glGetUniformLocation for? Put simply, they are how the Java code talks to the shaders: they look up the locations of shader variables so we can pass parameters to them, which lets the shaders change how the rendering pipeline behaves at run time.
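For example, the vertex shader above declares a uniform mat4 u_MVPMatrix that it does not use yet. A minimal sketch of feeding it from Java, assuming the shader were changed to compute gl_Position = u_MVPMatrix * a_position:

// Look up the uniform once after linking, then upload a matrix before drawing.
int uniformMvp = GLES20.glGetUniformLocation(program, "u_MVPMatrix");

float[] mvpMatrix = new float[16];
android.opengl.Matrix.setIdentityM(mvpMatrix, 0);             // identity: draw untransformed
android.opengl.Matrix.rotateM(mvpMatrix, 0, 45f, 0f, 0f, 1f); // e.g. rotate 45 degrees around Z

GLES20.glUniformMatrix4fv(uniformMvp, 1, false, mvpMatrix, 0);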


The following are the helper methods for loading the shader program:

public static int loadProgram(String vertexSource, String fragmentSource) {
    // Load the vertex shader
    int vertexShader = GLHelper.loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);

    // Load the fragment shader
    int fragmentShader = GLHelper.loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);

    // Create the program object
    int program = GLES20.glCreateProgram();

    if (program == 0) {
        throw new RuntimeException("Error create program.");
    }

    GLES20.glAttachShader(program, vertexShader);
    GLES20.glAttachShader(program, fragmentShader);

    // Link the program
    GLES20.glLinkProgram(program);

    int[] linked = new int[1];

    // Check the link status
    GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linked, 0);

    if (linked[0] == 0) {
        // Read the log before deleting the program, otherwise the handle is gone
        String log = GLES20.glGetProgramInfoLog(program);
        GLES20.glDeleteProgram(program);
        throw new RuntimeException("Error linking program: " + log);
    }

    // Free up no longer needed shader resources
    GLES20.glDeleteShader(vertexShader);
    GLES20.glDeleteShader(fragmentShader);

    return program;
}

public static int loadShader(int shaderType, String source) {
    // Create the shader object
    int shader = GLES20.glCreateShader(shaderType);

    if (shader == 0) {
        throw new RuntimeException("Error create shader.");
    }

    int[] compiled = new int[1];

    // Load the shader source
    GLES20.glShaderSource(shader, source);

    // Compile the shader
    GLES20.glCompileShader(shader);

    // Check the compile status
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);

    if (compiled[0] == 0) {
        // Read the log before deleting the shader, otherwise the handle is gone
        String log = GLES20.glGetShaderInfoLog(shader);
        GLES20.glDeleteShader(shader);
        throw new RuntimeException("Error compile shader: " + log);
    }

    return shader;
}

Why do so many OpenGL calls deal in plain int values? Because OpenGL allocates or binds objects in video memory itself and only hands back an integer id; that id then serves as the handle through which you manipulate the object's state.
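A minimal sketch of that handle pattern, purely for illustration (not taken from the code below): generate an id, operate on the object through it, and eventually release it the same way:

int[] handle = new int[1];
GLES20.glGenTextures(1, handle, 0);                      // the driver creates the object; we only get an int id
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, handle[0]);   // subsequent texture calls act on this id

// ... use the texture ...

GLES20.glDeleteTextures(1, handle, 0);                   // release the video memory behind the id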

 

Third, loading the texture. Loading a texture means uploading the image data to video memory so it can be used later. Note that the width and height of the texture image should preferably be powers of two; otherwise, on some devices, nothing may be drawn.
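If you cannot control the source image, one workaround is to scale it to power-of-two dimensions before uploading. A minimal sketch with helpers of my own (not part of the original code), using the standard Android Bitmap API:

// Round n up to the nearest power of two (n >= 1).
static int nextPowerOfTwo(int n) {
    int p = 1;
    while (p < n) p <<= 1;
    return p;
}

// Return a bitmap whose width and height are powers of two, scaling only when needed.
static Bitmap toPowerOfTwo(Bitmap src) {
    int w = nextPowerOfTwo(src.getWidth());
    int h = nextPowerOfTwo(src.getHeight());
    if (w == src.getWidth() && h == src.getHeight()) {
        return src;
    }
    return Bitmap.createScaledBitmap(src, w, h, true);
}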

static int[] loadTexture(String path) {
    int[] textureId = new int[1];

    // Generate a texture object
    GLES20.glGenTextures(1, textureId, 0);

    int[] result = null;

    if (textureId[0] != 0) {
        InputStream is = Tools.readFromAsserts(path);

        Bitmap bitmap;
        try {
            bitmap = BitmapFactory.decodeStream(is);
        } finally {
            try {
                is.close();
            } catch (IOException e) {
                throw new RuntimeException("Error loading Bitmap.");
            }
        }

        result = new int[3];
        result[TEXTURE_ID] = textureId[0];
        result[TEXTURE_WIDTH] = bitmap.getWidth();
        result[TEXTURE_HEIGHT] = bitmap.getHeight();

        // Bind to the texture in OpenGL
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);

        // Set filtering
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);

        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

        // Load the bitmap into the bound texture.
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);

        // Recycle the bitmap, since its data has been loaded into OpenGL.
        bitmap.recycle();

    } else {
        throw new RuntimeException("Error loading texture.");
    }

    return result;
}

The code is fairly self-explanatory. We use the Android Bitmap class together with GLUtils.texImage2D to hand the image to OpenGL in the format it expects. The flow is: generate a texture id on the graphics card, bind that id, upload the texture data, and keep the id so the texture can be used later. The texture filtering settings will be covered another time.

 


Now only the drawing itself is left. We have prepared the vertices and their texture coordinates, initialized the shaders, and uploaded the texture image. Next we put it all together and draw.

public void onDrawFrame(GL10 gl) {
    // Clear the screen to black
    GLES20.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);

    vertex.position(0);
    // Load the positions
    // 3 components (x, y, z)
    // stride = (3 + 2) * 4 (float size) = 20 bytes
    GLES20.glVertexAttribPointer(attribPosition,
                                 3, GLES20.GL_FLOAT,
                                 false, 20, vertex);

    vertex.position(3);
    // Load the texture coordinates
    GLES20.glVertexAttribPointer(attribTexCoord,
                                 2, GLES20.GL_FLOAT,
                                 false, 20, vertex);

    GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, index);
}

I have tried to keep the code as simple as possible. Again, OpenGL is state-based and bind calls are everywhere: glBindTexture tells OpenGL which texture id to use, and every subsequent texture operation applies to that bound texture. To render, OpenGL needs to know what to draw, so we feed it the native vertex buffer containing the positions and texture coordinates. GLES20.glVertexAttribPointer uploads that vertex data to the GPU in the layout OpenGL expects (note the 20-byte stride and the 3-float offset for the texture coordinates). Because the texture id was bound earlier, the draw call samples the uploaded image at the given texture coordinates.


Yes, with this approach the vertex data is re-uploaded to OpenGL on every frame; after all, main memory and video memory are not the same place, and OpenGL follows a client-server design. Of course, techniques such as VBOs (vertex buffer objects) can cache the data in video memory to improve performance. More on that later.
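As a preview, here is a minimal VBO sketch, an assumption of how it could look in this example rather than part of the original code: the vertex data is copied into a buffer object once, and later draw calls reference it by byte offset instead of re-sending the Java buffer every frame:

// One-time setup (e.g. in onSurfaceCreated), after loadVertex():
int[] vbo = new int[1];
GLES20.glGenBuffers(1, vbo, 0);
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vbo[0]);
vertex.position(0);
GLES20.glBufferData(GLES20.GL_ARRAY_BUFFER,
                    quadVertex.length * 4,               // size in bytes (4 bytes per float)
                    vertex, GLES20.GL_STATIC_DRAW);

// Per frame: point the attributes at byte offsets inside the bound VBO instead of a Java buffer.
GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vbo[0]);
GLES20.glVertexAttribPointer(attribPosition, 3, GLES20.GL_FLOAT, false, 20, 0);  // positions at offset 0
GLES20.glVertexAttribPointer(attribTexCoord, 2, GLES20.GL_FLOAT, false, 20, 12); // texcoords at offset 3 * 4 bytes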




