Android OpenGL ES (i): Essential Knowledge


1. The phone's coordinate system

If we want to draw graphics on a phone, we must first understand the phone's coordinate system, that is, how our drawing coordinates map onto the coordinates of the phone's screen.

Figure 1 Basic coordinate system of mobile phone screen


2. OpenGL Basic Graphics

In OpenGL, you can only draw points, lines, and triangles.


The triangle is the most basic shape because its structure is so stable; it can be seen everywhere, for example in the structural members of a bridge. A triangle has three sides connecting its three vertices; if we remove one vertex, what remains is a straight line, and if we remove another, only a single point is left.


Points and lines can be used for some effects, but only triangles can construct scenes full of complex objects and textures. In OpenGL, we group individual points together to build triangles and then tell OpenGL how to connect them. Everything we want to build is defined in terms of points, lines, and triangles; if we want to build more complex shapes, such as arches, we need to approximate the curve with enough points.
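
For instance, here is a small, hedged sketch of that idea in Java (the method name and parameters are illustrative assumptions, not code from this article): it samples the rim of a circle at regular angles, producing points that could be connected as a fan of triangles around the center.

// Approximate a circle's rim with numPoints (x, y) pairs; more points
// give a smoother curve. Suitable for a triangle fan around the center.
float[] buildCirclePoints(float radius, int numPoints) {
    float[] points = new float[numPoints * 2];
    for (int i = 0; i < numPoints; i++) {
        double angle = 2.0 * Math.PI * i / numPoints;
        points[i * 2] = (float) (radius * Math.cos(angle));
        points[i * 2 + 1] = (float) (radius * Math.sin(angle));
    }
    return points;
}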


3. Making data accessible to OpenGL

When we compile and run Java code on an emulator or a device, it does not run directly on the hardware; instead, it runs in a special environment, the Dalvik virtual machine. Code running on the virtual machine has no direct access to the native environment except through specific APIs.


The Dalvik virtual machine also uses garbage collection. This means that when the virtual machine detects that a variable, object, or other piece of memory is no longer being used, it frees that memory for reuse; it can also move memory around to use space more efficiently.


The native environment does not work like this: it does not expect memory blocks to be moved or freed automatically.


Android was designed this way so that developers do not have to worry about the particular CPU or machine architecture, or about low-level memory management, when writing programs. This usually works well until we need to interface with a native system such as OpenGL. OpenGL runs directly on the hardware as a native system library; there is no virtual machine and no garbage collection or memory compaction.


The Dalvik virtual machine is one of the main features of Android, but if our code runs inside a virtual machine, how does it communicate with OpenGL? There are two techniques. The first is the Java Native Interface (JNI), and this work has already been done for us by the Android SDK: when we call a method in the android.opengl.GLES20 package, the SDK is actually using JNI behind the scenes to call the native system library.


The second technique is to change the way memory is allocated: Java has a special set of classes that can allocate blocks of native memory and copy Java data into them. Native memory can be accessed by the native environment and is not managed by the garbage collector.


Figure 2 Transferring data from Dalvik to OpenGL


Example:

private float[] rectangle = {
    // Triangle 1
    -0.5f, -0.5f,
     0.5f,  0.5f,
    -0.5f,  0.5f,

    // Triangle 2
    -0.5f, -0.5f,
     0.5f, -0.5f,
     0.5f,  0.5f
};

private static final int BYTES_PER_FLOAT = 4;

private final FloatBuffer vertexData;

vertexData = ByteBuffer
    .allocateDirect(rectangle.length * BYTES_PER_FLOAT)
    .order(ByteOrder.nativeOrder())
    .asFloatBuffer();

vertexData.put(rectangle);


This adds an integer constant and a FloatBuffer variable. A Java float has 32 bits of precision, while a byte has only 8 bits; it may seem obvious, but each float takes 4 bytes. The FloatBuffer will be used to store data in native memory.


First, we use ByteBuffer.allocateDirect() to allocate a block of native memory that is not managed by the garbage collector. This method needs to know how many bytes to allocate; since the vertices are stored in a float array and each float takes 4 bytes, the size of this block must be rectangle.length * BYTES_PER_FLOAT.


The next line tells the byte buffer to organize its contents in native byte order. Byte order matters when a value occupies more than one byte, such as a 32-bit integer: the bytes can be arranged from the most significant byte to the least significant byte, or in the reverse order. Think of it as writing a number from left to right versus right to left. Which ordering is used is not important in itself; what matters is using the same ordering as the platform, and calling order(ByteOrder.nativeOrder()) guarantees this.


Finally, rather than manipulating individual bytes directly, we want to work with floating-point numbers, so we call asFloatBuffer() to get a FloatBuffer instance that reflects the underlying bytes. We can then call vertexData.put(rectangle) to copy the data from Dalvik's memory into native memory. The native memory is freed when the process is destroyed, so we normally don't need to worry about it; however, if your code creates many ByteBuffers over the lifetime of the program, you may want to learn some techniques for managing fragmentation and memory.
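
As a hedged sketch of where this buffer ends up (the attribute location variable below is an assumption, obtained elsewhere via GLES20.glGetAttribLocation(), and is not part of the example above), the FloatBuffer is typically rewound and handed to OpenGL like this:

// Rewind the buffer so OpenGL starts reading at the first vertex.
vertexData.position(0);

// Each vertex here has 2 float components (x, y), tightly packed (stride 0).
GLES20.glVertexAttribPointer(aPositionLocation, 2, GLES20.GL_FLOAT,
        false, 0, vertexData);
GLES20.glEnableVertexAttribArray(aPositionLocation);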


4. Introduction to the OpenGL pipeline

Before anything is drawn onto the screen, it must pass through the OpenGL pipeline, and this requires small programs called shaders that tell the graphics processing unit (GPU) how to draw the data. There are two types of shaders, and both must be defined before anything can be drawn to the screen.


Vertex shader: generates the final position of each vertex and is executed once per vertex. Once the final positions are determined, OpenGL can assemble the collection of visible vertices into points, lines, and triangles.


Fragment shader: produces the final color of each fragment that makes up a point, line, or triangle, and is executed once per fragment. A fragment is a small, single-colored rectangular area, similar to a pixel on a computer screen.


Once the final colors are generated, OpenGL writes them into a block of memory called the frame buffer, and Android then displays the frame buffer on the screen.
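
On Android, getting that frame buffer onto the screen is usually handled by a GLSurfaceView. As a brief, hedged sketch (the renderer class name MyRenderer is an assumption), the typical setup inside an Activity looks like this:

// In an Activity's onCreate(): request an OpenGL ES 2.0 context
// and attach a renderer whose callbacks do the actual drawing.
GLSurfaceView glSurfaceView = new GLSurfaceView(this);
glSurfaceView.setEGLContextClientVersion(2);
glSurfaceView.setRenderer(new MyRenderer()); // implements GLSurfaceView.Renderer
setContentView(glSurfaceView);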


Figure 3 OpenGL Pipeline Overview
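
Before defining the two shaders, here is a minimal, hedged sketch of how either type of shader is handed to the pipeline from Java (the helper name and error handling are illustrative assumptions, not code from this article):

// Compile one shader stage; type is GLES20.GL_VERTEX_SHADER or
// GLES20.GL_FRAGMENT_SHADER, and shaderCode is the GLSL source text.
public static int compileShader(int type, String shaderCode) {
    final int shaderObjectId = GLES20.glCreateShader(type);
    GLES20.glShaderSource(shaderObjectId, shaderCode);
    GLES20.glCompileShader(shaderObjectId);

    final int[] compileStatus = new int[1];
    GLES20.glGetShaderiv(shaderObjectId, GLES20.GL_COMPILE_STATUS,
            compileStatus, 0);
    if (compileStatus[0] == 0) {
        // Compilation failed; delete the shader object and report failure.
        GLES20.glDeleteShader(shaderObjectId);
        return 0;
    }
    return shaderObjectId;
}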

5. Creating a vertex shader

Create a res/raw folder in the Android project and put the shader files there so they are easy to reference.
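
As a hedged helper sketch (the method name readRawTextFile is an assumption for illustration; it requires java.io and android.content.Context imports), a shader file stored in res/raw can be read into a String like this:

// Read a raw text resource, such as a shader source file, into a String.
static String readRawTextFile(Context context, int resourceId) throws IOException {
    StringBuilder body = new StringBuilder();
    BufferedReader reader = new BufferedReader(
            new InputStreamReader(context.getResources().openRawResource(resourceId)));
    String line;
    while ((line = reader.readLine()) != null) {
        body.append(line).append('\n');
    }
    reader.close();
    return body.toString();
}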


Example simple_vertex_shader.glsl:

attribute vec4 a_position;

void main()
{
    gl_Position = a_position;
}

These shaders are defined using GLSL, OpenGL's shading language, whose syntax is similar to that of C.


The vertex shader is called once for every single point we define; when called, it receives the position of the current vertex in the a_position attribute, which is defined as type vec4.


A vec4 is a vector containing 4 components. In the context of a position, the 4 components are the x, y, z, and w coordinates: x, y, and z correspond to a three-dimensional position, while w is a special coordinate that will be explained later; we can skip it for now. If unspecified, OpenGL by default sets the first three components of the vector to 0 and the last one to 1.


A vertex can have several attributes, such as a color and a position. The keyword "attribute" is how we feed these attributes into the shader.


After that, we define main(), the main entry point of the shader. All it does is copy the previously defined position into the designated output variable gl_Position; the shader must assign a value to gl_Position. OpenGL treats the value stored in gl_Position as the final position of the vertex and assembles the vertices into points, lines, and triangles.


6. Creating a fragment shader

Rasterization Technology


The display of a mobile device is made up of thousands of small, independent components called pixels, and each pixel appears able to display one of millions of different colors. In reality this is a visual trick: most displays cannot truly create millions of colors, so each pixel is usually made up of three separate subcomponents that emit red, green, and blue light. Because each pixel is very small, the human eye blends the red, green, and blue light together, creating a huge range of perceived colors; put enough of these pixels together and the screen can display any image.


OpenGL uses a process called rasterization to break each point, line, and triangle down into a large number of small fragments that can be mapped onto a mobile device's display to produce an image. These fragments are analogous to the pixels on the display: each one holds a single solid color. To represent that color, each fragment has 4 components: red, green, and blue for the color itself, and alpha for transparency.


Figure 4 Rasterization: creating fragments


Example simple_fragment_shader.glsl:

precision mediump float;

uniform vec4 u_color;

void main()
{
    gl_FragColor = u_color;
}

In this fragment shader, the first line at the top of the file defines the default precision for all floating-point data types. This is like choosing between float and double in Java code.


We can choose lowp, mediump, or highp, which correspond to low, medium, and high precision; however, only some hardware implementations support highp in the fragment shader.


A careful reader may notice that the vertex shader did not define a precision. Vertex shaders can also define precision, but for a vertex, precision matters most, so the OpenGL designers decided to default the vertex shader's precision to the highest level, highp.


As you might guess, higher-precision data types are more accurate, but at the expense of performance. For fragment shaders, mediump is chosen for maximum compatibility; it is a trade-off between speed and quality.


The rest of this fragment shader is similar to the vertex shader defined earlier, but this time we pass in a uniform called u_color. Unlike an attribute, which is set once per vertex, a uniform keeps the same value for every vertex until we change it. Like the position attribute in the vertex shader, u_color is a four-component vector, but in the context of a color the four components correspond to red, green, blue, and alpha.
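
As a hedged illustration (the program handle and location variable below are assumptions for this sketch), a uniform such as u_color is set from Java by looking up its location in the linked program and writing a value before drawing:

// Look up the uniform's location once after the program is linked.
int uColorLocation = GLES20.glGetUniformLocation(program, "u_color");

// Every fragment drawn after this call sees the same opaque red.
GLES20.glUniform4f(uColorLocation, 1.0f, 0.0f, 0.0f, 1.0f);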


We then define main(), the entry point of this shader. It copies the color we defined in the uniform into the designated output variable gl_FragColor; the shader must assign a value to gl_FragColor, and OpenGL uses that value as the final color of the current fragment.


One sentence to fully understand the fragment shader: its primary purpose is to tell the GPU what the final color of each fragment should be.


One sentence to fully understand the vertex shader: its primary purpose is to determine the final position of each vertex.
