Fun with Android Camera Development (III): Previewing the Camera with GLSurfaceView (a basic Camera demo, first published in China)


GLSurfaceView is an OpenGL-related class that can also be used to preview the Camera, and it can do things the other preview views cannot. Where is it unique? When SurfaceView is powerless to solve a problem, GLSurfaceView is often the only option. It truly separates the Camera's data from its display: for example, having the Camera produce preview data without showing anything on screen is trivial for it. The built-in Camera app of Android 4.0 previews with SurfaceView; in 4.2 it was replaced by GLSurfaceView; and in 4.4 it uses Android's own TextureView. So by studying GLSurfaceView we can also explore the intent behind adding TextureView.

Although the Android 4.2 Camera source code previews with GLSurfaceView, it is wrapped in layer upon layer of encapsulation, and because OpenGL is involved it is genuinely hard to follow. My requirements are not high: I just want to figure out how GLSurfaceView previews the Camera. After some searching, Baidu turned up nothing usable, and a long trawl through Google found nothing either. Many people do use GLSurfaceView and SurfaceView together to preview the Camera, with SurfaceView showing the preview data and a GLSurfaceView layered on top to draw extra information. So I had no choice but to work it out myself. My first attempts could take pictures and get data, but the screen showed nothing at all, only a white or black board. Eventually I found a usable link on Stack Overflow, and by following it and experimenting for another full day I finally understood the basic rendering flow of OpenGL ES 2.0, which differs slightly from plain OpenGL. The source code is below:

1. CameraGLSurfaceView.java extends GLSurfaceView and implements two interfaces.

package org.yanzi.camera.preview;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import org.yanzi.camera.CameraInterface;

import android.content.Context;
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.opengl.GLSurfaceView.Renderer;
import android.util.AttributeSet;
import android.util.Log;

public class CameraGLSurfaceView extends GLSurfaceView implements Renderer, SurfaceTexture.OnFrameAvailableListener {

    private static final String TAG = "yanzi";
    Context mContext;
    SurfaceTexture mSurface;
    int mTextureID = -1;
    DirectDrawer mDirectDrawer;

    public CameraGLSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        setEGLContextClientVersion(2);
        setRenderer(this);
        setRenderMode(RENDERMODE_WHEN_DIRTY);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Log.i(TAG, "onSurfaceCreated...");
        mTextureID = createTextureID();
        mSurface = new SurfaceTexture(mTextureID);
        mSurface.setOnFrameAvailableListener(this);
        mDirectDrawer = new DirectDrawer(mTextureID);
        CameraInterface.getInstance().doOpenCamera(null);
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Log.i(TAG, "onSurfaceChanged...");
        GLES20.glViewport(0, 0, width, height);
        if (!CameraInterface.getInstance().isPreviewing()) {
            CameraInterface.getInstance().doStartPreview(mSurface, 1.33f);
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        Log.i(TAG, "onDrawFrame...");
        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        mSurface.updateTexImage();
        float[] mtx = new float[16];
        mSurface.getTransformMatrix(mtx);
        mDirectDrawer.draw(mtx);
    }

    @Override
    public void onPause() {
        super.onPause();
        CameraInterface.getInstance().doStopCamera();
    }

    private int createTextureID() {
        int[] texture = new int[1];
        GLES20.glGenTextures(1, texture, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture[0]);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
        return texture[0];
    }

    public SurfaceTexture _getSurfaceTexture() {
        return mSurface;
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        Log.i(TAG, "onFrameAvailable...");
        this.requestRender();
    }
}
This class is briefly described as follows:

1. The Renderer interface has three callbacks: onSurfaceCreated(), onSurfaceChanged(), and onDrawFrame(). In the constructor, the OpenGL ES version used by the GLSurfaceView is set with setEGLContextClientVersion(2); without this setting nothing can be drawn, because Android supports OpenGL ES 1.1, 2.0 and now 3.0, the versions differ greatly, and the view would not know which version of the API to render with. After setRenderer(this), the render mode is set to RENDERMODE_WHEN_DIRTY. This is also critical; quoting the API documentation:

When renderMode is RENDERMODE_CONTINUOUSLY, the renderer is called repeatedly to re-render the scene. When renderMode is RENDERMODE_WHEN_DIRTY, the renderer only renders when the surface is created, or when requestRender is called. Defaults to RENDERMODE_CONTINUOUSLY.

Using RENDERMODE_WHEN_DIRTY can improve battery life and overall system performance by allowing the GPU and CPU to idle when the view does not need to be updated.

The general idea: in RENDERMODE_CONTINUOUSLY mode the renderer keeps rendering all the time; if set to RENDERMODE_WHEN_DIRTY, it renders only when the surface is created or when GLSurfaceView's requestRender() is called explicitly. The default is the continuous mode. The Camera, at roughly 30 frames per second, obviously suits the dirty mode: render only when a frame of data actually arrives.

2. Since the mode is RENDERMODE_WHEN_DIRTY, something must tell GLSurfaceView when to render, that is, when to enter the onDrawFrame() function. That is exactly what the SurfaceTexture.OnFrameAvailableListener interface does: when a frame of data arrives, its callback fires:

public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.i(TAG, "onFrameAvailable...");
    this.requestRender();
}

All it does here is call requestRender() to schedule the next render pass.

3. Some OpenGL ES examples on the Internet implement SurfaceTexture.OnFrameAvailableListener in the Activity instead. In fact it does not matter which class implements this interface; what matters is what the callback does.
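For instance, nothing stops the Activity from being the listener. A minimal sketch, with illustrative wiring that is not how this demo is actually organized:

// In the Activity, which would declare "implements SurfaceTexture.OnFrameAvailableListener":
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    // Same job as in CameraGLSurfaceView: a camera frame arrived, ask the GL thread to redraw.
    glSurfaceView.requestRender();
}

// And in CameraGLSurfaceView.onSurfaceCreated(), register the Activity instead of "this":
//     mSurface.setOnFrameAvailableListener((SurfaceTexture.OnFrameAvailableListener) mContext);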

4. Compared with TextureView: when previewing with TextureView, the SurfaceTexture is created automatically by the framework and handed over through SurfaceTextureListener. With GLSurfaceView you must create a texture ID yourself and bind it to a SurfaceTexture by hand.
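For contrast, a minimal sketch of the TextureView route, reusing the CameraInterface helpers that appear in the code above (exactly how the earlier parts of this series wired this up is an assumption here):

// textureView: a TextureView taken from the Activity's layout (hypothetical variable in this sketch)
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        // The framework created this SurfaceTexture for us; no glGenTextures call is needed.
        CameraInterface.getInstance().doOpenCamera(null);
        CameraInterface.getInstance().doStartPreview(surface, 1.33f);
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        CameraInterface.getInstance().doStopCamera();
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) { }
});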

5. In this article, the Camera is opened in onSurfaceCreated() and the preview is started in onSurfaceChanged(), with a default aspect ratio of 1.33. The reason is that, compared with the two earlier preview methods, creating the SurfaceTexture takes some time here. If you instead want the Activity to start the preview, the GLSurfaceView needs a Handler to pass the created SurfaceTexture back to the Activity.
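A minimal sketch of that hand-off, assuming a hypothetical message code and a Handler field in the Activity (neither exists in the original demo):

// In the Activity: a Handler on the main thread receives the SurfaceTexture and starts the preview.
static final int MSG_SURFACE_READY = 1; // hypothetical message code

final Handler mHandler = new Handler(Looper.getMainLooper()) {
    @Override
    public void handleMessage(Message msg) {
        if (msg.what == MSG_SURFACE_READY) {
            SurfaceTexture texture = (SurfaceTexture) msg.obj;
            CameraInterface.getInstance().doStartPreview(texture, 1.33f);
        }
    }
};

// In CameraGLSurfaceView.onSurfaceCreated(), after mSurface is created, the GL thread posts it back:
//     mHandler.obtainMessage(MSG_SURFACE_READY, mSurface).sendToTarget();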


2. DirectDrawer.java is critical: it draws the content of the SurfaceTexture onto the screen.

package org.yanzi.camera.preview;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;

public class DirectDrawer {
    private final String vertexShaderCode =
            "attribute vec4 vPosition;" +
            "attribute vec2 inputTextureCoordinate;" +
            "varying vec2 textureCoordinate;" +
            "void main()" +
            "{" +
                "gl_Position = vPosition;" +
                "textureCoordinate = inputTextureCoordinate;" +
            "}";

    private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;" +
            "varying vec2 textureCoordinate;\n" +
            "uniform samplerExternalOES s_texture;\n" +
            "void main() {" +
            "  gl_FragColor = texture2D( s_texture, textureCoordinate );\n" +
            "}";

    private FloatBuffer vertexBuffer, textureVerticesBuffer;
    private ShortBuffer drawListBuffer;
    private final int mProgram;
    private int mPositionHandle;
    private int mTextureCoordHandle;

    private short drawOrder[] = { 0, 1, 2, 0, 2, 3 }; // order to draw vertices

    // number of coordinates per vertex in this array
    private static final int COORDS_PER_VERTEX = 2;

    private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

    static float squareCoords[] = {
       -1.0f,  1.0f,
       -1.0f, -1.0f,
        1.0f, -1.0f,
        1.0f,  1.0f,
    };

    static float textureVertices[] = {
        0.0f, 1.0f,
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 0.0f,
    };

    private int texture;

    public DirectDrawer(int texture) {
        this.texture = texture;

        // initialize vertex byte buffer for shape coordinates
        ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
        bb.order(ByteOrder.nativeOrder());
        vertexBuffer = bb.asFloatBuffer();
        vertexBuffer.put(squareCoords);
        vertexBuffer.position(0);

        // initialize byte buffer for the draw list
        ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
        dlb.order(ByteOrder.nativeOrder());
        drawListBuffer = dlb.asShortBuffer();
        drawListBuffer.put(drawOrder);
        drawListBuffer.position(0);

        ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
        bb2.order(ByteOrder.nativeOrder());
        textureVerticesBuffer = bb2.asFloatBuffer();
        textureVerticesBuffer.put(textureVertices);
        textureVerticesBuffer.position(0);

        int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
        int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

        mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // creates OpenGL ES program executables
    }

    public void draw(float[] mtx) {
        GLES20.glUseProgram(mProgram);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);

        // get handle to vertex shader's vPosition member
        mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");

        // Enable a handle to the triangle vertices
        GLES20.glEnableVertexAttribArray(mPositionHandle);

        // Prepare the <insert shape here> coordinate data
        GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, vertexBuffer);

        mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
        GLES20.glEnableVertexAttribArray(mTextureCoordHandle);

//        textureVerticesBuffer.clear();
//        textureVerticesBuffer.put( transformTextureCoordinates( textureVertices, mtx ));
//        textureVerticesBuffer.position(0);
        GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT, false, vertexStride, textureVerticesBuffer);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length, GLES20.GL_UNSIGNED_SHORT, drawListBuffer);

        // Disable vertex array
        GLES20.glDisableVertexAttribArray(mPositionHandle);
        GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
    }

    private int loadShader(int type, String shaderCode) {
        // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
        // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
        int shader = GLES20.glCreateShader(type);

        // add the source code to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);

        return shader;
    }

    private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
        float[] result = new float[coords.length];
        float[] vt = new float[4];

        for (int i = 0; i < coords.length; i += 2) {
            float[] v = { coords[i], coords[i + 1], 0, 1 };
            Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
            result[i] = vt[0];
            result[i + 1] = vt[1];
        }
        return result;
    }
}

3. With the above two classes, 95% of the work is done. You can treat the GLSurfaceView as having its own lifecycle: the Camera is closed in its onPause(), and two methods are overridden in the Activity:

@Override
protected void onResume() {
    super.onResume();
    glSurfaceView.bringToFront();
}

@Override
protected void onPause() {
    super.onPause();
    glSurfaceView.onPause();
}
The glSurfaceView.bringToFront() call is actually not required; simply declaring the custom GLSurfaceView in the layout is enough:

<FrameLayout
    android:layout_width="wrap_content"
    android:layout_height="wrap_content" >

    <org.yanzi.camera.preview.CameraGLSurfaceView
        android:id="@+id/camera_textureview"
        android:layout_width="0dip"
        android:layout_height="0dip" />
</FrameLayout>
CameraActivity is only responsible for the UI; CameraGLSurfaceView opens the Camera, starts the preview, and calls DirectDrawer's draw() to do the rendering. There is no other Camera-related code in the Activity.
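For completeness, a minimal sketch of what such an Activity can look like; the package and layout file name are assumptions, while the view id matches the layout fragment above:

package org.yanzi.camera.activity; // assumed package

import android.app.Activity;
import android.os.Bundle;

import org.yanzi.camera.preview.CameraGLSurfaceView;

public class CameraActivity extends Activity {

    CameraGLSurfaceView glSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera); // assumed layout file containing the FrameLayout above
        glSurfaceView = (CameraGLSurfaceView) findViewById(R.id.camera_textureview);
    }

    @Override
    protected void onResume() {
        super.onResume();
        glSurfaceView.bringToFront();
    }

    @Override
    protected void onPause() {
        super.onPause();
        glSurfaceView.onPause();
    }
}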

Note:

1. In onDrawFrame(), if you do not call mDirectDrawer.draw(mtx), nothing is displayed at all! This is what makes GLSurfaceView special. Why? Because GLSurfaceView is not an ordinary Android display view the way SurfaceView and TextureView are, so you have to follow the OpenGL ES rendering process.
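One debugging tip of my own (not part of the original demo): a shader that fails to compile also gives you a blank screen, so adding a compile-status check to loadShader() makes that case visible in the log:

private int loadShader(int type, String shaderCode) {
    int shader = GLES20.glCreateShader(type);
    GLES20.glShaderSource(shader, shaderCode);
    GLES20.glCompileShader(shader);

    // Added check: report a failed compile instead of silently drawing nothing.
    int[] compiled = new int[1];
    GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
    if (compiled[0] == 0) {
        Log.e("yanzi", "Shader compile failed: " + GLES20.glGetShaderInfoLog(shader));
        GLES20.glDeleteShader(shader);
        return 0;
    }
    return shader;
}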

2. I have not yet fully worked out where the buffer drawn in mDirectDrawer.draw(mtx) comes from; there is no explicit buffer request anywhere. Instead, a texture ID is generated and wrapped by the SurfaceTexture created in the GLSurfaceView; that one ID is bound both to the SurfaceTexture and to the DirectDrawer, with the SurfaceTexture acting as the carrier of the rendered frames.
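My reading of how the pieces connect, as a sketch (CameraInterface is not listed in this article, so the setPreviewTexture call is an assumption about what doStartPreview does internally):

// 1. GL thread: create an OES texture ID and wrap it in a SurfaceTexture.
int texId = createTextureID();                      // glGenTextures + GL_TEXTURE_EXTERNAL_OES parameters
SurfaceTexture surface = new SurfaceTexture(texId); // camera frames will land in this texture

// 2. Camera side, presumably inside CameraInterface.doStartPreview():
//        mCamera.setPreviewTexture(surface);
//        mCamera.startPreview();

// 3. Drawing side: DirectDrawer binds the same ID and samples it as samplerExternalOES.
DirectDrawer drawer = new DirectDrawer(texId);
// Each frame: surface.updateTexImage(); then drawer.draw(mtx);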

3. Note the following code:

@Override
public void onDrawFrame(GL10 gl) {
    float[] mtx = new float[16];
    mSurface.updateTexImage();
    mSurface.getTransformMatrix(mtx);

    mDirectVideo.draw(mtx);
}
private float[] transformTextureCoordinates(float[] coords, float[] matrix) {
    float[] result = new float[coords.length];
    float[] vt = new float[4];

    for (int i = 0; i < coords.length; i += 2) {
        float[] v = { coords[i], coords[i + 1], 0, 1 };
        Matrix.multiplyMV(vt, 0, matrix, 0, v, 0);
        result[i] = vt[0];
        result[i + 1] = vt[1];
    }
    return result;
}
textureVerticesBuffer.clear();
textureVerticesBuffer.put( transformTextureCoordinates( textureVertices, mtx ));
textureVerticesBuffer.position(0);
I have integrated all of this code into the demo, but it is not actually used in the draw() method, because once it is enabled the preview image comes out distorted instead of looking right. The code above obtains the SurfaceTexture's transform matrix via mSurface.getTransformMatrix().

This matrix is then passed into draw(), where it is applied to textureVerticesBuffer before drawing.

If this matrix transform is not applied, the preview looks normal (screenshots omitted here). With the transform matrix applied, the image comes out distorted; I cannot say exactly how it is distorted, but it is enough to show how powerful OpenGL ES rendering is: just by setting a matrix, without processing every frame of data, you get a completely different display effect.
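To illustrate that point (my own example, not part of the demo): inside DirectDrawer.draw() you could feed any 4x4 matrix built with android.opengl.Matrix through transformTextureCoordinates() instead of the SurfaceTexture's matrix, for example to mirror the preview horizontally without touching a single frame of data:

// Hypothetical addition inside DirectDrawer.draw(): mirror the texture coordinates around x = 0.5.
float[] flip = new float[16];
Matrix.setIdentityM(flip, 0);
Matrix.translateM(flip, 0, 1.0f, 0.0f, 0.0f);  // combined with the scale below this maps u -> 1 - u
Matrix.scaleM(flip, 0, -1.0f, 1.0f, 1.0f);

// Reuse the existing helper and buffer exactly as in the commented-out lines above.
textureVerticesBuffer.clear();
textureVerticesBuffer.put(transformTextureCoordinates(textureVertices, flip));
textureVerticesBuffer.position(0);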




------------------------------- This article is original; the author is yanzi1225627.

Version: playcamera_v3.0.0[2014-6-222.16.zip

CSDN download link: http://download.csdn.net/detail/yanzi1225627/7547263

Baidu cloud Disk:

Appendix OpenGL ES concise Tutorial: http://www.apkbus.com/android-20427-1-1.html

