How to use OpenGL ES Media Effects in Android
The Android media effects framework lets developers apply a variety of impressive visual effects to photos and videos. The framework performs all of its image processing on the GPU, and it accepts only OpenGL textures as input. In this tutorial, you will learn how to use OpenGL ES 2.0 to convert an image resource into a texture, and how to use the framework to apply different effects to it.
Preparation
To follow this tutorial, you need:
1. An IDE that supports Android development. If you do not have one, download the latest version of Android Studio from the Android Developers website.
2. An Android device with a GPU that supports OpenGL ES 2.0.
3. A basic understanding of OpenGL.
Set Up the OpenGL ES Environment: Create a GLSurfaceView
To display OpenGL graphics, you use the GLSurfaceView class. Like any other View subclass, you can add it to your Activity or Fragment either by declaring it in a layout XML file or by creating an instance in code.
In this tutorial, GLSurfaceView is the only View in our Activity. Therefore, for convenience, we create the GLSurfaceView instance in code and pass it to setContentView so that it fills the entire screen. The Activity's onCreate method looks like this:
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    GLSurfaceView view = new GLSurfaceView(this);
    setContentView(view);
}
Because the media effects framework supports only OpenGL ES 2.0 and higher, pass 2 to the setEGLContextClientVersion method:
view.setEGLContextClientVersion(2);
To make sure that GLSurfaceView renders its contents only when necessary, set its render mode:
view.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
Create a Renderer
Renderer is responsible for rendering the content in GLSurfaceView.
Create a class that implements the GLSurfaceView.Renderer interface. We will call this class EffectsRenderer. Add a constructor and override the abstract methods of the interface, as shown below:
public class EffectsRenderer implements GLSurfaceView.Renderer {

    public EffectsRenderer(Context context){
        super();
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
    }

    @Override
    public void onDrawFrame(GL10 gl) {
    }
}
Go back to the Activity and call setRenderer so that GLSurfaceView uses the renderer we just created:
view.setRenderer(new EffectsRenderer(this));
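For reference, here is what the complete onCreate can look like once all of these calls are combined. Note that setEGLContextClientVersion must be called before setRenderer, and setRenderMode can only be called after a renderer has been set:

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    GLSurfaceView view = new GLSurfaceView(this);
    view.setEGLContextClientVersion(2);                       // must precede setRenderer
    view.setRenderer(new EffectsRenderer(this));              // attach our renderer
    view.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);  // render only on demand
    setContentView(view);
}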
Write a Manifest File
If you want to publish your app on Google Play, add the following declaration to the AndroidManifest.xml file:
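<uses-feature android:glEsVersion="0x00020000" android:required="true" />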
This ensures that your app can only be installed on devices that support OpenGL ES 2.0. The OpenGL environment is now ready.
Create an OpenGL Plane: Define the Vertices
GLSurfaceView cannot display a photo directly. The photo first has to be converted into a texture and applied to an OpenGL shape. In this tutorial, we will create a 2D plane with four vertices; to keep things simple, we will use a square. Create a new class, Square, to represent this shape:
public class Square {}
By default, the origin of the OpenGL coordinate system is at the center, so the coordinates of the square's four corners are:
Lower left corner: (-1, -1); lower right corner: (1, -1); upper right corner: (1, 1); upper left corner: (-1, 1)
Every object drawn with OpenGL is made up of triangles. To draw the square, we need two triangles that share a common edge, which means the coordinates of the triangles are:
Triangle 1: (-1, -1), (1, -1), and (-1, 1); Triangle 2: (1, -1), (-1, 1), and (1, 1)
Create a float array to represent these vertices:
private float vertices[] = {
    -1f, -1f,
     1f, -1f,
    -1f,  1f,
     1f,  1f,
};
To map the texture onto the square, you need to specify the texture coordinates of each vertex. Create another array to hold them:
private float textureVertices[] = {
    0f, 1f,
    1f, 1f,
    0f, 0f,
    1f, 0f
};
Create Buffers
Before OpenGL can use these coordinate arrays, they have to be converted into byte buffers. Start by declaring the buffers:
private FloatBuffer verticesBuffer;
private FloatBuffer textureBuffer;
Initialize these buffers in a new method, initializeBuffers, using ByteBuffer.allocateDirect. Because a float uses 4 bytes, the byte buffer needs to be four times as long as the float array. Call the order method with ByteOrder.nativeOrder to use the byte order of the underlying native platform, and then call asFloatBuffer to convert the ByteBuffer into a FloatBuffer. Once the FloatBuffer is created, call put to load the float array into it, and finally call position(0) so that reading starts from the beginning of the buffer.
private void initializeBuffers(){
    ByteBuffer buff = ByteBuffer.allocateDirect(vertices.length * 4);
    buff.order(ByteOrder.nativeOrder());
    verticesBuffer = buff.asFloatBuffer();
    verticesBuffer.put(vertices);
    verticesBuffer.position(0);

    buff = ByteBuffer.allocateDirect(textureVertices.length * 4);
    buff.order(ByteOrder.nativeOrder());
    textureBuffer = buff.asFloatBuffer();
    textureBuffer.put(textureVertices);
    textureBuffer.position(0);
}
Create the Shaders
A shader is a small C-like program that runs on the GPU. In this tutorial, we use two kinds of shaders: a vertex shader and a fragment shader.
Code of the vertex shader:
attribute vec4 aPosition;
attribute vec2 aTexPosition;
varying vec2 vTexPosition;
void main() {
    gl_Position = aPosition;
    vTexPosition = aTexPosition;
}
Code of the fragment shader:
precision mediump float;
uniform sampler2D uTexture;
varying vec2 vTexPosition;
void main() {
    gl_FragColor = texture2D(uTexture, vTexPosition);
}
If you already know OpenGL, this code will look familiar. If not, refer to the OpenGL documentation; here is a brief explanation:
The vertex shader is responsible for drawing the individual vertices. aPosition is an attribute bound to the FloatBuffer that contains the vertex coordinates, and aTexPosition is an attribute bound to the FloatBuffer that contains the texture coordinates. gl_Position is a built-in OpenGL variable that represents the position of each vertex. vTexPosition is a varying variable whose value is passed on to the fragment shader.
In this tutorial, the fragment shader is responsible for coloring the square. It picks colors from the texture using the texture2D method and assigns them to the fragment using the built-in OpenGL variable gl_FragColor.
In the Square class, store the shader code as String fields:
private final String vertexShaderCode =
    "attribute vec4 aPosition;" +
    "attribute vec2 aTexPosition;" +
    "varying vec2 vTexPosition;" +
    "void main() {" +
    "   gl_Position = aPosition;" +
    "   vTexPosition = aTexPosition;" +
    "}";

private final String fragmentShaderCode =
    "precision mediump float;" +
    "uniform sampler2D uTexture;" +
    "varying vec2 vTexPosition;" +
    "void main() {" +
    "   gl_FragColor = texture2D(uTexture, vTexPosition);" +
    "}";
Create a program
Create a new method, initializeProgram, that creates an OpenGL program after compiling and linking the shaders.
Use glCreateShader to create a shader object; it returns a handle to the object as an int. To create the vertex shader, pass GL_VERTEX_SHADER to it; for the fragment shader, pass GL_FRAGMENT_SHADER. Then use glShaderSource to associate the corresponding shader code with the shader, and glCompileShader to compile it.
After compiling both shaders, create a new program with glCreateProgram, which, like glCreateShader, returns an int handle. Call glAttachShader to attach the shaders to the program, and finally call glLinkProgram to link it.
Code:
private int vertexShader;
private int fragmentShader;
private int program;

private void initializeProgram(){
    vertexShader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
    GLES20.glShaderSource(vertexShader, vertexShaderCode);
    GLES20.glCompileShader(vertexShader);

    fragmentShader = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
    GLES20.glShaderSource(fragmentShader, fragmentShaderCode);
    GLES20.glCompileShader(fragmentShader);

    program = GLES20.glCreateProgram();
    GLES20.glAttachShader(program, vertexShader);
    GLES20.glAttachShader(program, fragmentShader);
    GLES20.glLinkProgram(program);
}
You may have noticed that the OpenGL methods (the ones starting with gl) belong to the GLES20 class; that is because we are using OpenGL ES 2.0. If you want to use a higher version, use the GLES30 or GLES31 class instead.
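This tutorial skips error checking for brevity, but while developing it can help to verify that the shaders actually compiled. Here is a minimal sketch using glGetShaderiv and glGetShaderInfoLog (it uses android.util.Log, and checkCompile is a hypothetical helper, not part of the tutorial's code):

private void checkCompile(int shader) {
    int[] compiled = new int[1];
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
        // Print the compiler output so shader mistakes are easy to spot
        Log.e("Square", GLES20.glGetShaderInfoLog(shader));
    }
}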
Draw the Shape
Now add a draw method that draws the shape using the vertices and texture coordinates we defined earlier.
The following is what you need to do:
1. Call glBindFramebuffer with 0 to bind the default (on-screen) framebuffer.
2. Call glUseProgram to start using the program we linked earlier.
3. Pass GL_BLEND to the glDisable method to disable color blending while rendering.
4. Call glGetAttribLocation to get handles to the aPosition and aTexPosition variables, and glGetUniformLocation to get a handle to uTexture.
5. Use glVertexAttribPointer to connect the aPosition and aTexPosition handles to verticesBuffer and textureBuffer respectively.
6. Use glBindTexture to bind the texture (passed in as a parameter of the draw method) to the fragment shader.
7. Call glClear to clear the contents of the GLSurfaceView.
8. Finally, use glDrawArrays to draw the two triangles (and thus the square).
Code:
public void draw(int texture){
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    GLES20.glUseProgram(program);
    GLES20.glDisable(GLES20.GL_BLEND);

    int positionHandle = GLES20.glGetAttribLocation(program, "aPosition");
    int textureHandle = GLES20.glGetUniformLocation(program, "uTexture");
    int texturePositionHandle = GLES20.glGetAttribLocation(program, "aTexPosition");

    GLES20.glVertexAttribPointer(texturePositionHandle, 2, GLES20.GL_FLOAT, false, 0, textureBuffer);
    GLES20.glEnableVertexAttribArray(texturePositionHandle);

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture);
    GLES20.glUniform1i(textureHandle, 0);

    GLES20.glVertexAttribPointer(positionHandle, 2, GLES20.GL_FLOAT, false, 0, verticesBuffer);
    GLES20.glEnableVertexAttribArray(positionHandle);

    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
}
Call the initialization methods from the constructor:
public Square(){
    initializeBuffers();
    initializeProgram();
}
Render the OpenGL Plane and Texture
Right now our Renderer does nothing. We need to change it so that it renders the plane we created in the previous steps.
First, let's create a Bitmap. Add a photo to the res/drawable folder; I named mine forest.jpg. Use BitmapFactory to convert the photo into a Bitmap, and store the photo's dimensions as well.
Change the constructor of EffectsRenderer as follows:
private Bitmap photo;
private int photoWidth, photoHeight;

public EffectsRenderer(Context context){
    super();
    photo = BitmapFactory.decodeResource(context.getResources(), R.drawable.forest);
    photoWidth = photo.getWidth();
    photoHeight = photo.getHeight();
}
Create a new method, generateSquare, to convert the Bitmap into a texture and initialize a Square object. You also need an array to hold references to the textures; use glGenTextures to initialize the array and glBindTexture to activate the texture at index 0.
Next, call glTexParameteri to set various parameters that control how the texture is rendered:
Set GL_TEXTURE_MIN_FILTER (the minification filter) and GL_TEXTURE_MAG_FILTER (the magnification filter) to GL_LINEAR so that the texture looks smooth, even when it is stretched or shrunk.
Set GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_CLAMP_TO_EDGE so that the texture is never repeated.
Finally, call GLUtils.texImage2D to load the Bitmap into the texture. The generateSquare method looks like this:
private int textures[] = new int[2];
private Square square;

private void generateSquare(){
    GLES20.glGenTextures(2, textures, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textures[0]);

    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, photo, 0);
    square = new Square();
}
Whenever the size of the GLSurfaceView changes, the onSurfaceChanged method is called. There we need to call glViewport with the new dimensions, call glClearColor to set the clear color to black, and then call generateSquare to reinitialize the texture and the plane.
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
    GLES20.glViewport(0, 0, width, height);
    GLES20.glClearColor(0, 0, 0, 1);
    generateSquare();
}
Finally, call the draw method in onDrawFrame:
@Override
public void onDrawFrame(GL10 gl) {
    square.draw(textures[0]);
}
If you run the app now, you should see the photo you chose rendered on your phone's screen.
The Media Effects Framework
All of the complex code we have written so far was preparation for using the media effects framework. Now it is time to use the framework itself. Add the following fields to your Renderer class:
private EffectContext effectContext;
private Effect effect;
Use EffectContext.createWithCurrentGlContext to initialize the effectContext field, which is responsible for managing effects within an OpenGL context. To optimize performance, it should be called only once. Add the following code at the beginning of onDrawFrame:
if(effectContext == null) {
    effectContext = EffectContext.createWithCurrentGlContext();
}
Creating an effect is very simple: use the EffectFactory obtained from effectContext to create an Effect object. Once the Effect is available, call its apply method, passing in a reference to the original texture (textures[0] in this example) along with a reference to a blank texture object (textures[1] here). After apply has been called, textures[1] contains the result of the effect.
For example, here is how to create and apply the grayscale effect:
private void grayScaleEffect(){
    EffectFactory factory = effectContext.getFactory();
    effect = factory.createEffect(EffectFactory.EFFECT_GRAYSCALE);
    effect.apply(textures[0], photoWidth, photoHeight, textures[1]);
}
Call this method in onDrawFrame and pass textures[1] to the draw method of Square:
@Override
public void onDrawFrame(GL10 gl) {
    if(effectContext == null) {
        effectContext = EffectContext.createWithCurrentGlContext();
    }
    if(effect != null){
        effect.release();
    }
    grayScaleEffect();
    square.draw(textures[1]);
}
The release method frees the resources held by the Effect. When you run the app, you should see a grayscale version of the photo.
The same approach works for other effects, for example the documentary effect:
private void documentaryEffect(){
    EffectFactory factory = effectContext.getFactory();
    effect = factory.createEffect(EffectFactory.EFFECT_DOCUMENTARY);
    effect.apply(textures[0], photoWidth, photoHeight, textures[1]);
}
Run the app again to see the documentary effect applied to the photo.
Some effects take parameters. For example, the brightness adjustment effect has a brightness parameter that accepts a float value. You can use setParameter to change the value of a parameter, as in the following code:
private void brightnessEffect(){
    EffectFactory factory = effectContext.getFactory();
    effect = factory.createEffect(EffectFactory.EFFECT_BRIGHTNESS);
    effect.setParameter("brightness", 2f);
    effect.apply(textures[0], photoWidth, photoHeight, textures[1]);
}
The result is a brighter version of the photo.
Summary
In this tutorial, you learned how to use the media effects framework to apply various effects to your photos. Along the way, you also learned how to draw a plane using OpenGL ES 2.0 and apply a texture to it.
The framework can be applied to both photos and videos. For videos, you just have to apply the effect to each frame of the video inside the onDrawFrame method.
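As a rough, hypothetical sketch of that idea, assuming each decoded video frame has already been uploaded into videoTexture (a regular GL_TEXTURE_2D) with dimensions frameWidth and frameHeight, the per-frame loop could look something like this:

@Override
public void onDrawFrame(GL10 gl) {
    // videoTexture, frameWidth and frameHeight are assumed to be kept up to date
    // elsewhere; decoding and uploading video frames is outside this tutorial.
    if (effectContext == null) {
        effectContext = EffectContext.createWithCurrentGlContext();
    }
    if (effect != null) {
        effect.release();
    }
    effect = effectContext.getFactory().createEffect(EffectFactory.EFFECT_GRAYSCALE);
    effect.apply(videoTexture, frameWidth, frameHeight, textures[1]);
    square.draw(textures[1]);
}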
You have seen three effects in this tutorial, but the framework offers many more. To learn about them, refer to the EffectFactory documentation on the Android Developers website.
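Not every device supports every effect, so before relying on one it is worth checking availability with EffectFactory.isEffectSupported. Here is a minimal sketch (EFFECT_SEPIA is just one example of another available effect):

private void sepiaEffect(){
    EffectFactory factory = effectContext.getFactory();
    if (EffectFactory.isEffectSupported(EffectFactory.EFFECT_SEPIA)) {
        // Only create and apply the effect if this device supports it
        effect = factory.createEffect(EffectFactory.EFFECT_SEPIA);
        effect.apply(textures[0], photoWidth, photoHeight, textures[1]);
    }
}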