Understanding the Android OpenGL ES World Coordinate System and Texture Coordinate System
Every OpenGL "hello world" that teaches you how to draw an image on the screen contains two arrays:
static final float COORD[] = {
        -1.0f, -1.0f,
        1.0f, -1.0f,
        -1.0f, 1.0f,
        1.0f, 1.0f,
};
static final float TEXTURE_COORD[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
};
However, those tutorials hardly explain the arrays, so I did not know why the points are written this way, or whether their order follows any rule. After consulting various materials and experimenting, I finally understood it all.
1. Coordinate System
PS: I learned OpenGL ES mainly for 2D stickers/overlays, so the Z axis is not covered here.
Figure 1 shows OpenGL's world coordinate system, which holds no surprises. The pitfall is the texture coordinate system, whose origin is the bottom-left corner. In practice, an image on Android is laid out as in the rightmost figure, with the top-left corner as the origin.
My understanding is that a texture is essentially an array of color values. Because Android UI coordinates take the top-left corner as the origin, the storage order of the color values in that array differs, and so the coordinate systems differ.
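The relationship between the two origins boils down to mirroring the vertical axis: u is unchanged, v is flipped. Here is a minimal sketch (the helper class and method names are hypothetical, not part of the demo):

```java
// Hypothetical helper illustrating the relationship between normalized
// Android image coordinates (origin at the top-left) and OpenGL texture
// coordinates (origin at the bottom-left): u stays the same, v is mirrored.
public class TexCoordConvert {

    // Convert a normalized Android image coordinate (u, v) to a GL texture coordinate.
    public static float[] androidToGl(float u, float v) {
        return new float[]{u, 1.0f - v};
    }

    public static void main(String[] args) {
        // Android's top-left (0,0) corresponds to GL texture coordinate (0,1).
        float[] gl = androidToGl(0.0f, 0.0f);
        System.out.println("Android (0,0) -> GL (" + gl[0] + ", " + gl[1] + ")");
    }
}
```

This is exactly why `TEXTURE_COORD` above pairs the world corner (-1,-1) with texture (0,1): the bottom-left of the quad shows the bottom row of the image, which Android stored last.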
2. Sample Code
public class Filter {
    protected static final String VERTEX_SHADER = "" +
            "attribute vec4 position;\n" +
            "attribute vec4 inputTextureCoordinate;\n" +
            " \n" +
            "varying vec2 textureCoordinate;\n" +
            " \n" +
            "void main()\n" +
            "{\n" +
            "    gl_Position = position;\n" +
            "    textureCoordinate = inputTextureCoordinate.xy;\n" +
            "}";

    protected static final String FRAGMENT_SHADER = "" +
            "varying highp vec2 textureCoordinate;\n" +
            " \n" +
            "uniform sampler2D inputImageTexture;\n" +
            " \n" +
            "void main()\n" +
            "{\n" +
            "    gl_FragColor = texture2D(inputImageTexture, textureCoordinate);\n" +
            "}";

    static final float COORD1[] = {
            -1.0f, -1.0f,
            1.0f, -1.0f,
            -1.0f, 1.0f,
            1.0f, 1.0f,
    };
    static final float TEXTURE_COORD1[] = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            0.0f, 0.0f,
            1.0f, 0.0f,
    };

    static final float COORD2[] = {
            -1.0f, 1.0f,
            -1.0f, -1.0f,
            1.0f, 1.0f,
            1.0f, -1.0f,
    };
    static final float TEXTURE_COORD2[] = {
            0.0f, 0.0f,
            0.0f, 1.0f,
            1.0f, 0.0f,
            1.0f, 1.0f,
    };

    static final float COORD3[] = {
            1.0f, -1.0f,
            1.0f, 1.0f,
            -1.0f, -1.0f,
            -1.0f, 1.0f,
    };
    static final float TEXTURE_COORD3[] = {
            1.0f, 1.0f,
            1.0f, 0.0f,
            0.0f, 1.0f,
            0.0f, 0.0f,
    };

    static final float COORD4[] = {
            1.0f, -1.0f,
            1.0f, 1.0f,
            -1.0f, -1.0f,
            -1.0f, 1.0f,
    };
    static final float TEXTURE_COORD4[] = {
            1.0f, 1.0f,
            1.0f, 0.0f,
            0.0f, 1.0f,
            0.0f, 0.0f,
    };

    static final float COORD_REVERSE[] = {
            1.0f, -1.0f,
            1.0f, 1.0f,
            -1.0f, -1.0f,
            -1.0f, 1.0f,
    };
    static final float TEXTURE_COORD_REVERSE[] = {
            1.0f, 0.0f,
            1.0f, 1.0f,
            0.0f, 0.0f,
            0.0f, 1.0f,
    };

    static final float COORD_FLIP[] = {
            1.0f, -1.0f,
            1.0f, 1.0f,
            -1.0f, -1.0f,
            -1.0f, 1.0f,
    };
    static final float TEXTURE_COORD_FLIP[] = {
            0.0f, 1.0f,
            0.0f, 0.0f,
            1.0f, 1.0f,
            1.0f, 0.0f,
    };

    private String mVertexShader;
    private String mFragmentShader;
    private FloatBuffer mCubeBuffer;
    private FloatBuffer mTextureCubeBuffer;

    protected int mProgId;
    protected int mAttribPosition;
    protected int mAttribTexCoord;
    protected int mUniformTexture;

    public Filter() {
        this(VERTEX_SHADER, FRAGMENT_SHADER);
    }

    public Filter(String vertexShader, String fragmentShader) {
        mVertexShader = vertexShader;
        mFragmentShader = fragmentShader;
    }

    public void init() {
        loadVertex();
        initShader();
        GLES20.glBlendFunc(GLES20.GL_ONE, GLES20.GL_ONE_MINUS_SRC_ALPHA);
    }

    public void loadVertex() {
        float[] coord = COORD1;
        float[] texture_coord = TEXTURE_COORD1;

        mCubeBuffer = ByteBuffer.allocateDirect(coord.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mCubeBuffer.put(coord).position(0);

        mTextureCubeBuffer = ByteBuffer.allocateDirect(texture_coord.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        mTextureCubeBuffer.put(texture_coord).position(0);
    }

    public void initShader() {
        mProgId = GLHelper.loadProgram(mVertexShader, mFragmentShader);
        mAttribPosition = GLES20.glGetAttribLocation(mProgId, "position");
        mUniformTexture = GLES20.glGetUniformLocation(mProgId, "inputImageTexture");
        mAttribTexCoord = GLES20.glGetAttribLocation(mProgId, "inputTextureCoordinate");
    }

    public void drawFrame(int glTextureId) {
        if (!GLES20.glIsProgram(mProgId)) {
            initShader();
        }
        GLES20.glUseProgram(mProgId);

        mCubeBuffer.position(0);
        GLES20.glVertexAttribPointer(mAttribPosition, 2, GLES20.GL_FLOAT, false, 0, mCubeBuffer);
        GLES20.glEnableVertexAttribArray(mAttribPosition);

        mTextureCubeBuffer.position(0);
        GLES20.glVertexAttribPointer(mAttribTexCoord, 2, GLES20.GL_FLOAT, false, 0, mTextureCubeBuffer);
        GLES20.glEnableVertexAttribArray(mAttribTexCoord);

        if (glTextureId != GLHelper.NO_TEXTURE) {
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, glTextureId);
            GLES20.glUniform1i(mUniformTexture, 0);
        }

        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);

        GLES20.glDisableVertexAttribArray(mAttribPosition);
        GLES20.glDisableVertexAttribArray(mAttribTexCoord);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        GLES20.glDisable(GLES20.GL_BLEND);
    }
}
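The `ByteBuffer.allocateDirect(...).order(nativeOrder()).asFloatBuffer()` pattern in `loadVertex` is the standard way to hand a Java float array to OpenGL ES: vertex attribute pointers require a direct buffer in native byte order. As a sketch, it can be pulled out into a reusable helper (the class name `BufferUtil` is my own, not part of the demo):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class BufferUtil {
    // Wrap a float array in a direct, native-order FloatBuffer, as required by
    // GLES20.glVertexAttribPointer. Each float occupies 4 bytes, hence length * 4.
    public static FloatBuffer toFloatBuffer(float[] data) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(data.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        // Copy the data in and rewind so GL reads from the first element.
        buffer.put(data).position(0);
        return buffer;
    }
}
```

With this helper, `loadVertex` would reduce to two calls, one for the vertex array and one for the texture coordinate array.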
Note this call:

GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
Since OpenGL ES is a trimmed-down version of OpenGL, it can only draw triangles directly; anything more complicated must be composed of triangles. GLES20.GL_TRIANGLE_STRIP is one of the triangle drawing modes, and it corresponds to this vertex array:
static final float COORD[] = {
        -1.0f, -1.0f, // 1
        1.0f, -1.0f,  // 2
        -1.0f, 1.0f,  // 3
        1.0f, 1.0f,   // 4
};
What is actually drawn is a rectangle formed by two triangles: one from vertices 1, 2, 3 and one from vertices 2, 3, 4. With more vertices the pattern continues, each new vertex forming a triangle with the two before it; for example, five vertices yield the triangles (1, 2, 3), (2, 3, 4), and (3, 4, 5). For example:
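The strip rule can be checked with a small helper that enumerates the triangles a strip of n vertices produces (1-based vertex numbers to match the text; this sketch ignores the alternating winding order GL applies internally):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class TriangleStrip {
    // For a strip of n vertices, each vertex i (i >= 3) forms a triangle
    // with the two vertices before it: (1,2,3), (2,3,4), (3,4,5), ...
    public static List<int[]> triangles(int n) {
        List<int[]> result = new ArrayList<>();
        for (int i = 3; i <= n; i++) {
            result.add(new int[]{i - 2, i - 1, i});
        }
        return result;
    }

    public static void main(String[] args) {
        // A 5-vertex strip produces three triangles.
        for (int[] t : triangles(5)) {
            System.out.println(Arrays.toString(t));
        }
    }
}
```

So a strip of n vertices always yields n - 2 triangles, which is why 4 vertices are enough for the rectangle.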
3. Texture Vertex Order
The texture coordinate points correspond one-to-one to the world coordinate points:
static final float COORD1[] = {
        -1.0f, -1.0f,
        1.0f, -1.0f,
        -1.0f, 1.0f,
        1.0f, 1.0f,
};
static final float TEXTURE_COORD1[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
};
Display result
As the arrows in the middle indicate, OpenGL maps the color at each texture vertex to the corresponding world coordinate vertex, and the pixels in between are interpolated according to certain rules. We can see that the displayed image is stretched vertically, because of the source image's own proportions, and because in this program
GLES20.glViewport(0, 0, width, height);
gives a display area whose height is larger than its width (this involves the mapping between OpenGL world coordinates and screen coordinates, which is outside the scope of this article).
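One common way to avoid such stretching (a sketch of my own, not something the demo does) is to shrink the full-screen quad on one axis so the image keeps its aspect ratio inside the viewport, i.e. letterboxing:

```java
public class AspectFit {
    // Scale a full-screen quad ([-1,1] on both axes) down on one axis so an
    // image of imgW x imgH keeps its aspect ratio inside a viewW x viewH
    // viewport ("fit inside" / letterbox behaviour).
    public static float[] fitQuad(int imgW, int imgH, int viewW, int viewH) {
        float imgAspect = (float) imgW / imgH;
        float viewAspect = (float) viewW / viewH;
        float sx = 1f, sy = 1f;
        if (imgAspect > viewAspect) {
            // Image is relatively wider than the viewport: shrink vertically.
            sy = viewAspect / imgAspect;
        } else {
            // Image is relatively taller than the viewport: shrink horizontally.
            sx = imgAspect / viewAspect;
        }
        // Same vertex order as COORD1: bottom-left, bottom-right, top-left, top-right.
        return new float[]{
                -sx, -sy,
                 sx, -sy,
                -sx,  sy,
                 sx,  sy,
        };
    }
}
```

For example, a square image in a viewport twice as tall as it is wide would get a quad scaled to half height in world coordinates, leaving empty bars above and below.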
In fact, as long as the world coordinate points and the texture coordinate points correspond to each other correctly, pair by pair, the order in which the vertices are listed does not matter.
All four sets of coordinates in the code produce the same display:
static final float COORD1[] = {
        -1.0f, -1.0f,
        1.0f, -1.0f,
        -1.0f, 1.0f,
        1.0f, 1.0f,
};
static final float TEXTURE_COORD1[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        1.0f, 0.0f,
};
static final float COORD2[] = {
        -1.0f, 1.0f,
        -1.0f, -1.0f,
        1.0f, 1.0f,
        1.0f, -1.0f,
};
static final float TEXTURE_COORD2[] = {
        0.0f, 0.0f,
        0.0f, 1.0f,
        1.0f, 0.0f,
        1.0f, 1.0f,
};
static final float COORD3[] = {
        1.0f, -1.0f,
        1.0f, 1.0f,
        -1.0f, -1.0f,
        -1.0f, 1.0f,
};
static final float TEXTURE_COORD3[] = {
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        0.0f, 0.0f,
};
static final float COORD4[] = {
        1.0f, -1.0f,
        1.0f, 1.0f,
        -1.0f, -1.0f,
        -1.0f, 1.0f,
};
static final float TEXTURE_COORD4[] = {
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        0.0f, 0.0f,
};
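The claim can be verified mechanically: build the world-vertex to texture-vertex mapping for each pair of arrays and compare. A small sketch using the first two sets above:

```java
import java.util.HashMap;
import java.util.Map;

public class MappingCheck {
    // Build a map from each world vertex "x,y" to its texture coordinate "u,v".
    public static Map<String, String> mapping(float[] coord, float[] tex) {
        Map<String, String> m = new HashMap<>();
        for (int i = 0; i < coord.length; i += 2) {
            m.put(coord[i] + "," + coord[i + 1], tex[i] + "," + tex[i + 1]);
        }
        return m;
    }

    public static void main(String[] args) {
        float[] coord1 = {-1f, -1f, 1f, -1f, -1f, 1f, 1f, 1f};
        float[] tex1 = {0f, 1f, 1f, 1f, 0f, 0f, 1f, 0f};
        float[] coord2 = {-1f, 1f, -1f, -1f, 1f, 1f, 1f, -1f};
        float[] tex2 = {0f, 0f, 0f, 1f, 1f, 0f, 1f, 1f};
        // Different vertex order, identical vertex-to-texture correspondence.
        System.out.println(mapping(coord1, tex1).equals(mapping(coord2, tex2))); // prints "true"
    }
}
```

Running the same comparison across all four sets shows they encode one and the same correspondence, just listed in different orders.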
If you don't believe it, try swapping them in here:

float[] coord = COORD1;
float[] texture_coord = TEXTURE_COORD1;
To deepen the understanding, you can even play some tricks, such as:
static final float COORD_REVERSE[] = {
        1.0f, -1.0f,
        1.0f, 1.0f,
        -1.0f, -1.0f,
        -1.0f, 1.0f,
};
static final float TEXTURE_COORD_REVERSE[] = {
        1.0f, 0.0f,
        1.0f, 1.0f,
        0.0f, 0.0f,
        0.0f, 1.0f,
};

......

float[] coord = COORD_REVERSE;
float[] texture_coord = TEXTURE_COORD_REVERSE;
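Such variants need not be written by hand. A hypothetical helper (my own sketch, not part of the demo) can mirror an existing texture coordinate array along one or both axes; for instance, flipping TEXTURE_COORD_REVERSE on both axes yields the TEXTURE_COORD_FLIP array from the Filter class:

```java
public class TexFlip {
    // Mirror u (horizontal flip) and/or v (vertical flip) of a flat
    // {u0,v0, u1,v1, ...} texture coordinate array; returns a new array.
    public static float[] flip(float[] tex, boolean flipU, boolean flipV) {
        float[] out = new float[tex.length];
        for (int i = 0; i < tex.length; i += 2) {
            out[i] = flipU ? 1f - tex[i] : tex[i];
            out[i + 1] = flipV ? 1f - tex[i + 1] : tex[i + 1];
        }
        return out;
    }
}
```

Because every coordinate is either 0 or 1 here, mirroring is simply `1 - value`, so the results are exact.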
The result is as follows:
4. Demo Source Code address
https://github.com/yellowcath/GLCoordDemo.git