iOS Camera Feature Analysis (3): OpenGL Special Effects


I originally intended to wrap up the introduction to the Camera software in two parts, but then realized that OpenGL had not been covered yet, so I added this article.

This article covers three topics:

(1) Displaying the video with OpenGL

(2) Real-time special-effect processing of the video

(3) Scaling and rotating the video, for example displaying it at its original aspect ratio (proportional scaling)

This part is not easy to cover in a blog post: there is a lot of background involved, especially OpenGL-specific knowledge, and trying to teach OpenGL through a single article is clearly unrealistic. So here I only walk through the process; for what is actually going on inside, please refer to an OpenGL book. I plan to write some dedicated OpenGL posts after this series is finished.

The overall flow is: first get the frame data from AVCaptureSession while video is being captured, then apply the special-effect processing (for the effects themselves, see the Image & Animation column), and finally initialize OpenGL and render the frame as a texture.

(1) How do we obtain the video data stream?

 

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (videoOutput == captureOutput) {
        // Queue the frame and let the main thread pick it up for preview.
        OSStatus err = CMBufferQueueEnqueue(previewBufferQueue, sampleBuffer);
        if (!err) {
            dispatch_async(dispatch_get_main_queue(), ^{
                CMSampleBufferRef sbuf = (CMSampleBufferRef)CMBufferQueueDequeueAndRetain(previewBufferQueue);
                if (sbuf) {
                    CVImageBufferRef pixBuf = CMSampleBufferGetImageBuffer(sbuf);
                    if (tflag) {
                        // Special-effect processing on pixBuf
                    }
                    // OpenGL texture display of pixBuf
                    CFRelease(sbuf);
                }
            });
        }
    }
}

After AVCaptureSession initialization is complete, we register this delegate callback, inside which we can conveniently get hold of the image data to be processed.
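The post does not show the session setup itself, but roughly: after creating the AVCaptureSession you add an AVCaptureVideoDataOutput, set its sample buffer delegate, and create the previewBufferQueue used above. A minimal sketch, assuming the session ivar is named captureSession (the ivar names and queue label are assumptions):

// Assumed ivars: AVCaptureSession *captureSession; AVCaptureVideoDataOutput *videoOutput;
// CMBufferQueueRef previewBufferQueue;
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Ask for 32-bit BGRA frames so they can be processed directly and uploaded as textures.
// (Use a __bridge cast for the key under ARC.)
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:
                               [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                               forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
videoOutput.alwaysDiscardsLateVideoFrames = YES;
// Deliver frames on a dedicated serial queue.
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("videoCaptureQueue", NULL)];
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}

// The queue the delegate above uses to hand frames to the main thread.
OSStatus err = CMBufferQueueCreate(kCFAllocatorDefault, 1,
                                   CMBufferQueueGetCallbacksForUnsortedSampleBuffers(),
                                   &previewBufferQueue);
if (err) {
    NSLog(@"CMBufferQueueCreate failed (%d)", (int)err);
}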

(2) How do we apply special effects to the images?

This is a complex topic in its own right; I wrote a separate post specifically about it (see the Image & Animation column mentioned above).

It involves various image-processing algorithms, NEON, assembly optimization, and use of the ARM internal registers.

Here we only show how to get at the RGBA pixels of the image buffer:

 

unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

Here, pixel stores the RGBA pixel value of the image.
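Note that the base address is only valid while the buffer is locked. Below is a minimal sketch of an in-place effect, assuming the capture output was configured for kCVPixelFormatType_32BGRA (so each pixel is stored as B, G, R, A) and using a simple desaturation merely as a stand-in for the real effect:

// Lock the buffer before touching its pixels, unlock when done.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t width       = CVPixelBufferGetWidth(pixelBuffer);
size_t height      = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

for (size_t y = 0; y < height; y++) {
    unsigned char *row = pixel + y * bytesPerRow;
    for (size_t x = 0; x < width; x++) {
        unsigned char *p = row + x * 4;
        unsigned char gray = (unsigned char)((p[0] + p[1] + p[2]) / 3); // average B, G, R
        p[0] = p[1] = p[2] = gray;                                      // alpha stays in p[3]
    }
}

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);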

(3) OpenGL texture mapping

 

3.1 Override layerClass so the view is backed by a CAEAGLLayer when using OpenGL ES:

+ (Class)layerClass
{
    return [CAEAGLLayer class];
}

3.2 Get the CAEAGLLayer:

CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;

3.3 Set the drawable properties of the CAEAGLLayer (RGBA8 color format)

3.4 Create the rendering context with OpenGL ES 2.0:

oglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

3.5 Make oglContext the current context

(A minimal sketch of steps 3.2 through 3.5 appears right after this list.)

3.6 Disable GL_DEPTH_TEST (it enables updating of the depth buffer, which is not needed here)

3.7 Create a framebuffer

3.8 Bind the framebuffer to the drawing pipeline

3.9 Create a renderbuffer (the drawing buffer)

3.10 Bind the renderbuffer to the pipeline

3.11 Allocate storage for the renderbuffer

3.12 Get the width and height of the current renderbuffer

3.13 Attach the renderbuffer to the framebuffer

3.14 Check whether the framebuffer status is valid

3.15 Create the OpenGL texture cache and texture objects

3.16 Load the vertex and fragment shaders

3.17 Create and initialize the program object
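Steps 3.2 through 3.5 are not part of initializeBuffers below, so here is a minimal sketch of them, assuming they run in the view's initializer and that oglContext is an ivar:

// Minimal sketch of steps 3.2 - 3.5 (assumed to run in the view's initializer).
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES;
// 3.3 RGBA8 color format for the drawable, no retained backing.
eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking,
                                kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,
                                nil];
// 3.4 Create the rendering context with OpenGL ES 2.0.
oglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
// 3.5 Make it the current context.
if (!oglContext || ![EAGLContext setCurrentContext:oglContext]) {
    NSLog(@"Failed to create or set the EAGLContext");
}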

The code for the remaining steps is as follows:

 

// Override layerClass so the view is backed by a CAEAGLLayer when using OpenGL ES.
+ (Class)layerClass
{
    return [CAEAGLLayer class];
}

- (BOOL)initializeBuffers
{
    // Make oglContext the current context.
    if ([EAGLContext currentContext] != oglContext) {
        if (![EAGLContext setCurrentContext:oglContext]) {
            NSLog(@"setCurrentContext error ......");
        }
    }

    BOOL success = YES;

    // Temporarily set the frame and bounds of the layer to full screen.
    CGRect rtFullscreen = [[UIScreen mainScreen] bounds];
    CGRect rtCurrframe  = self.layer.frame;
    CGRect rtCurrbounds = self.layer.bounds;
    self.layer.frame  = rtFullscreen;
    self.layer.bounds = rtFullscreen;

    NSLog(@"size {%f, %f, %f, %f}", rtFullscreen.origin.x, rtFullscreen.origin.y,
          rtFullscreen.size.width, rtFullscreen.size.height);

    // glEnable(GL_DEPTH_TEST) turns on depth-buffer updates: when the depth comparison passes, the
    // depth buffer is updated and OpenGL tracks pixels along the Z axis, so a pixel is only drawn
    // when nothing is in front of it. It generally improves 3D rendering, but we are drawing a
    // single 2D quad, so it is disabled here.
    glDisable(GL_DEPTH_TEST);
    // Create a framebuffer.
    glGenFramebuffers(1, &frameBufferHandle);
    // Bind the framebuffer to the drawing pipeline.
    glBindFramebuffer(GL_FRAMEBUFFER, frameBufferHandle);
    // Create a renderbuffer (the drawing buffer).
    glGenRenderbuffers(1, &colorBufferHandle);
    // Bind the renderbuffer to the pipeline.
    glBindRenderbuffer(GL_RENDERBUFFER, colorBufferHandle);

    // Allocate storage for the renderbuffer from the layer.
    [oglContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];

    // Get the width and height of the current renderbuffer.
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &renderBufferWidth);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &renderBufferHeight);
    // Attach the renderbuffer to the framebuffer.
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorBufferHandle);
    // Check whether the framebuffer status is valid.
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        NSLog(@"Failure with framebuffer generation 0x%x", glCheckFramebufferStatus(GL_FRAMEBUFFER));
        success = NO;
    }

    // Create a new CVOpenGLESTexture cache; texture objects created from it live in oglContext.
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, oglContext, NULL, &videoTextureCache);
    if (err) {
        NSLog(@"Error at CVOpenGLESTextureCacheCreate %d", err);
        success = NO;
    }

    // Load the vertex and fragment shaders.
    const GLchar *vertSrc = str_passThrough_v; // [self readFile:@"passThrough.vsh"];
    const GLchar *fragSrc = str_passThrough_f; // [self readFile:@"passThrough.fsh"];

    // Attributes
    GLint attribLocation[NUM_ATTRIBUTES] = {
        ATTRIB_VERTEX, ATTRIB_TEXTUREPOSITON,
    };
    GLchar *attribName[NUM_ATTRIBUTES] = {
        "position", "textureCoordinate",
    };
    // Create and initialize the program object.
    glueCreateProgram(vertSrc, fragSrc,
                      NUM_ATTRIBUTES, (const GLchar **)&attribName[0], attribLocation,
                      0, 0, 0, // we don't need to get uniform locations in this example
                      &passThroughProgram);

    if (!passThroughProgram)
        success = NO;

    // Restore the original frame and bounds.
    self.layer.frame  = rtCurrframe;
    self.layer.bounds = rtCurrbounds;
    return success;
}
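For completeness, here is a minimal sketch of the per-frame "OpenGL texture display" step mentioned in the delegate above. The method name renderWithPixelBuffer: and the two vertex arrays are my own naming; the calls follow the standard CVOpenGLESTextureCache pattern (as in Apple's RosyWriter sample, which this code appears to be based on):

// Minimal per-frame render sketch; assumes the ivars created in initializeBuffers above.
- (void)renderWithPixelBuffer:(CVImageBufferRef)pixelBuffer
{
    // Wrap the camera pixel buffer in an OpenGL ES texture via the texture cache.
    CVOpenGLESTextureRef texture = NULL;
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                videoTextureCache,
                                                                pixelBuffer,
                                                                NULL,
                                                                GL_TEXTURE_2D,
                                                                GL_RGBA,
                                                                (GLsizei)width,
                                                                (GLsizei)height,
                                                                GL_BGRA,
                                                                GL_UNSIGNED_BYTE,
                                                                0,
                                                                &texture);
    if (err || !texture) {
        NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed (%d)", err);
        return;
    }

    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // Draw the textured quad into the renderbuffer and present it on screen.
    glBindFramebuffer(GL_FRAMEBUFFER, frameBufferHandle);
    glViewport(0, 0, renderBufferWidth, renderBufferHeight);
    glUseProgram(passThroughProgram);

    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    glBindRenderbuffer(GL_RENDERBUFFER, colorBufferHandle);
    [oglContext presentRenderbuffer:GL_RENDERBUFFER];

    // Release the texture and flush the cache so buffers can be reused frame to frame.
    glBindTexture(CVOpenGLESTextureGetTarget(texture), 0);
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(videoTextureCache, 0);
}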

Finally, let's look at how to display the video at its original aspect ratio (proportional scaling) and similar adjustments.

Here we first need to set an attribute:

 

glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, 0, 0, textureVertices);

textureVertices is a coordinate array that determines how the image is laid out during texture mapping. For example:

Full-screen display:

 

GLfloat squareVertices0[8] = {
    -1.0f, -1.0f,
     1.0f, -1.0f,
    -1.0f,  1.0f,
     1.0f,  1.0f
};

Proportional scaling:

 

GLfloat squareVertices1[8] = {
    -0.5625f, -1.0f,
     0.5625f, -1.0f,
    -0.5625f,  1.0f,
     0.5625f,  1.0f
};

What do these numbers mean? The captured frame is 1920 × 1080, so 1080 / 1920 = 0.5625. Note that width and height are swapped relative to the portrait screen during capture.
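Rather than hard-coding 0.5625f, the scale can also be computed at runtime from the pixel buffer dimensions. This is only an illustrative sketch; the variable names are mine:

// Compute the horizontal scale from the actual frame size (1920 x 1080 gives 0.5625).
size_t frameWidth  = CVPixelBufferGetWidth(pixelBuffer);
size_t frameHeight = CVPixelBufferGetHeight(pixelBuffer);
// The camera delivers landscape frames, so on a portrait screen the short side of the
// frame maps onto the horizontal axis of the quad.
GLfloat scale = (GLfloat)frameHeight / (GLfloat)frameWidth;

GLfloat squareVertices1[8] = {
    -scale, -1.0f,
     scale, -1.0f,
    -scale,  1.0f,
     scale,  1.0f
};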

 
