Source: 袁峥 Seemygo
Link: http://www.jianshu.com/p/4646894245ba
Objective
Before reading this article, if you are not yet familiar with how live streaming works, please first read "How to Quickly Develop a Complete iOS Live App (Principles)".
When developing a live streaming app, the beauty (美颜) feature is essential; without it, you could lose followers by the tens of millions within minutes. This article mainly explains the principles behind the beauty feature in live streaming and then implements it.
The flow of applying beauty effects with GPUImage during a live stream is shown below:
(Figure: Beauty principle.png)
Basic Beauty Concepts
GPU (Graphics Processing Unit): the hardware in a phone or computer dedicated to image processing and rendering.
How the GPU works: the CPU tells the display controller to work, and the display controller fetches data and instructions from the location the CPU specifies. The data usually comes from video memory; if it is not there, it is fetched from RAM, and if it does not fit in RAM either, it is read from the hard disk. To save memory, data can be kept on the hard disk and loaded on demand when the display controller is instructed to fetch it.
OpenGL ES (Open Graphics Library for Embedded Systems): an open graphics and hardware interface for embedded systems, a set of APIs for processing images so that they can be displayed on the screen.
GPUImage: an open-source iOS image and video processing framework built on OpenGL ES 2.0. It offers a rich set of image processing filters, supports real-time filters on the camera for both photo and video capture, ships with more than 120 built-in filter effects, and lets you write custom filters.
How a filter works: it transforms each frame of a still picture or video and then displays the result; in essence it changes pixel coordinates and pixel colors.
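As a small illustration of this idea (an illustrative sketch, not code from the original demo: the kDimFragmentShader and dimFilter names are made up here), a custom GPUImage filter is essentially a fragment shader that rewrites each pixel's color:

// Custom-filter sketch: scale every pixel's RGB value down, i.e. change the pixel color.
// textureCoordinate and inputImageTexture are the names GPUImage's default vertex shader provides.
NSString *const kDimFragmentShader = SHADER_STRING
(
    varying highp vec2 textureCoordinate;
    uniform sampler2D inputImageTexture;

    void main()
    {
        lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
        gl_FragColor = vec4(color.rgb * 0.8, color.a);
    }
);

// Wrap the shader in a GPUImageFilter; it can then be inserted into any processing chain.
GPUImageFilter *dimFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromString:kDimFragmentShader];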
How GPUImage Processes Frames
GPUImage processes frames as a chain: each link is added with the addTarget: method, and when one target finishes processing, it hands the processed image to the next target. This is the GPUImage processing chain.
Targets generally fall into two categories:
The targets in the intermediate links are usually filters: GPUImageFilter or one of its subclasses.
The target at the end of the chain is either a GPUImageView, which displays the result on screen, or a GPUImageMovieWriter, which writes it to a video file.
Think of sunglasses: light from the outside world passes through the sunglasses (the filter) before reaching our eyes, which is why even daytime looks dark through them.
GPUImage processing has 3 main links:
Source (video or picture source) => filter => final target (the processed video or picture)
GPUImage Source: all subclasses of GPUImageOutput. A source is GPUImage's data supplier, just as the outside light is the input to the eye. Common sources are:
GPUImageVideoCamera: captures video in real time
GPUImageStillCamera: takes photos in real time
GPUImagePicture: processes pictures that have already been taken, such as png or jpg images
GPUImageMovie: processes videos that have already been shot, such as MP4 files
GPUImage filter: GPUImageFilter or one of its subclasses. The class inherits from GPUImageOutput and also conforms to the GPUImageInput protocol, so data can flow both into and out of it, just like sunglasses: the light is processed by the lenses and then enters our eyes.
GPUImage final target: a GPUImageView or a GPUImageMovieWriter, like our eyes, the place where the processed data finally arrives.
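For instance, such a chain can be exercised on a still image instead of the live camera (a minimal sketch; the face.jpg asset name is only an example and not part of the original demo):

// Sketch of a minimal chain: source (GPUImagePicture) => filter => processed UIImage.
UIImage *inputImage = [UIImage imageNamed:@"face.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageBilateralFilter *smoothFilter = [[GPUImageBilateralFilter alloc] init];

[stillImageSource addTarget:smoothFilter];   // build the chain with addTarget:
[smoothFilter useNextFrameForImageCapture];  // keep the next rendered frame so it can be read back
[stillImageSource processImage];             // push the picture through the chain
UIImage *outputImage = [smoothFilter imageFromCurrentFramebuffer];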
(Figure: GPUImage processing principle.png)
Beauty principle
Skin smoothing (GPUImageBilateralFilter): in essence, blurring the pixels. A Gaussian blur could be used, but it would also blur the edges; a bilateral filter blurs pixels selectively, so edges stay sharp.
Whitening (GPUImageBrightnessFilter): in essence, raising the brightness.
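In code this comes down to tuning two properties (a sketch with illustrative values; only the rule that distanceNormalizationFactor stays above 1 and that smaller values smooth more comes from this article, the concrete numbers are assumptions):

// Skin smoothing: a smaller distanceNormalizationFactor gives stronger smoothing (keep it above 1).
GPUImageBilateralFilter *bilateralFilter = [[GPUImageBilateralFilter alloc] init];
bilateralFilter.distanceNormalizationFactor = 4.0;

// Whitening: 0.0 leaves brightness unchanged, positive values brighten the image.
GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
brightnessFilter.brightness = 0.1;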
Beauty effect
GPUImage native beauty effect
(GIF: GPUImage native.gif)
Effect achieved with the GPUImageBeautifyFilter beauty filter
(Note: the animated GIF exceeds the maximum upload size, so it could not be included.)
GPUImage in Practice
GPUImage native beauty
Step One: Import GPUImage with CocoaPods
Step Two: Create the video source, GPUImageVideoCamera
Step Three: Create the final target, a GPUImageView
Step Four: Create a filter group (GPUImageFilterGroup) that combines a brightness filter (GPUImageBrightnessFilter) and a bilateral filter (GPUImageBilateralFilter) to produce the beauty effect
Step Five: Set up the chain inside the filter group
Step Six: Set up the GPUImage processing chain, from the data source through the filters to the final on-screen result
Step Seven: Start capturing video
Notes:
sessionPreset is best set to AVCaptureSessionPresetHigh, which adapts automatically; if you hard-code a resolution the current device does not support, you will get an error.
GPUImageVideoCamera must be strongly referenced, otherwise it is deallocated and video capture stops.
You must call startCameraCapture; only then does the framework render the captured video into the GPUImageView so it can be displayed.
The smaller the distanceNormalizationFactor of GPUImageBilateralFilter, the stronger the skin-smoothing effect; its value must be greater than 1.
- (void)viewDidLoad {
    [super viewDidLoad];

    // Create the video source
    // sessionPreset: capture resolution; AVCaptureSessionPresetHigh adapts to a high resolution automatically
    // cameraPosition: which camera (front or back) to use
    GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionFront];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    _videoCamera = videoCamera;

    // Create the final preview view
    GPUImageView *captureVideoPreview = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view insertSubview:captureVideoPreview atIndex:0];

    // Create the filters: skin smoothing, whitening, and the group that combines them
    GPUImageFilterGroup *groupFilter = [[GPUImageFilterGroup alloc] init];

    // Skin-smoothing filter
    GPUImageBilateralFilter *bilateralFilter = [[GPUImageBilateralFilter alloc] init];
    [groupFilter addFilter:bilateralFilter];
    _bilateralFilter = bilateralFilter;

    // Whitening filter
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    [groupFilter addFilter:brightnessFilter];
    _brightnessFilter = brightnessFilter;

    // Set up the chain inside the filter group
    [bilateralFilter addTarget:brightnessFilter];
    [groupFilter setInitialFilters:@[bilateralFilter]];
    groupFilter.terminalFilter = brightnessFilter;

    // Set up the GPUImage processing chain: data source => filters => final on-screen result
    [videoCamera addTarget:groupFilter];
    [groupFilter addTarget:captureVideoPreview];

    // startCameraCapture must be called; only then does the framework render the captured video into the GPUImageView.
    // Start capturing video
    [videoCamera startCameraCapture];
}

- (IBAction)brightnessFilter:(UISlider *)sender {
    _brightnessFilter.brightness = sender.value;
}

- (IBAction)bilateralFilter:(UISlider *)sender {
    // The smaller the factor, the stronger the smoothing effect
    CGFloat maxValue = 10;
    [_bilateralFilter setDistanceNormalizationFactor:(maxValue - sender.value)];
}
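If you also want to record the beautified stream rather than only previewing it, the other kind of final target mentioned earlier, GPUImageMovieWriter, can be attached to the same filter group. A hedged sketch, assumed to run right after the chain setup in viewDidLoad above (the file name, output size, and timing are illustrative assumptions, not part of the original demo):

// Sketch: additionally write the filtered frames to an MP4 file in the temporary directory.
NSString *moviePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"beauty.mp4"];
[[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
movieWriter.encodingLiveVideo = YES;            // the input is a live camera stream

[groupFilter addTarget:movieWriter];            // the filter group now feeds two targets: preview and writer
videoCamera.audioEncodingTarget = movieWriter;  // also record the microphone audio
[movieWriter startRecording];

// Later, when the broadcast ends:
// videoCamera.audioEncodingTarget = nil;
// [movieWriter finishRecording];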
Using the GPUImageBeautifyFilter beauty filter
Step One: Import GPUImage with CocoaPods
Step Two: Import the GPUImageBeautifyFilter folder
Step Three: Create the video source, GPUImageVideoCamera
Step Four: Create the final target, a GPUImageView
Step Five: Create the beauty filter, GPUImageBeautifyFilter
Step Six: Set up the GPUImage processing chain, from the data source through the filter to the final on-screen result
Note: the same caveats listed for the native version above still apply (keep a strong reference to the camera and call startCameraCapture).
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.

    // Create the video source
    // sessionPreset: capture resolution; AVCaptureSessionPresetHigh adapts to a high resolution automatically
    // cameraPosition: which camera (front or back) to use
    GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionFront];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    _videoCamera = videoCamera;

    // Create the final preview view
    GPUImageView *captureVideoPreview = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view insertSubview:captureVideoPreview atIndex:0];
    _captureVideoPreview = captureVideoPreview;

    // Set up the processing chain (no filter yet: camera straight to preview)
    [_videoCamera addTarget:_captureVideoPreview];

    // startCameraCapture must be called; only then does the framework render the captured video into the GPUImageView.
    // Start capturing video
    [videoCamera startCameraCapture];
}

- (IBAction)openBeautifyFilter:(UISwitch *)sender {
    // Toggle the beauty effect: remove the previous processing chain and rebuild it
    if (sender.on) {
        // Remove the previous processing chain
        [_videoCamera removeAllTargets];
        // Create the beauty filter
        GPUImageBeautifyFilter *beautifyFilter = [[GPUImageBeautifyFilter alloc] init];
        // Set up the GPUImage processing chain: data source => filter => final on-screen result
        [_videoCamera addTarget:beautifyFilter];
        [beautifyFilter addTarget:_captureVideoPreview];
    } else {
        // Remove the previous processing chain
        [_videoCamera removeAllTargets];
        [_videoCamera addTarget:_captureVideoPreview];
    }
}
GPUImage Extensions
An introduction to all GPUImage filters
http://www.360doc.com/content/15/0907/10/19175681_497418716.shtml
Beauty filter (GPUImageBeautifyFilter)
http://www.jianshu.com/p/945fc806a9b4
A grand summary of 美图秀秀 (Meitu) beauty filters
http://www.tuicool.com/articles/6bIbQbQ
Source code download
https://github.com/iThinkerYZ/GPUImgeDemo
Note: run pod install before opening the project for the first time.
Conclusion
Follow-up articles will also cover the principles behind GPUImage, OpenGL ES, video encoding, stream pushing, chat rooms, the gift system, and more. Stay tuned!
How to Quickly Develop a Complete iOS Live App (Beauty Edition)