Original Site: http://blog.csdn.net/bugrunner/article/details/7170471
1. Introduction
Image smoothing is a fundamental operation, and many mature algorithms exist for it. Here we briefly introduce bilateral filtering, mainly because I implemented SSAO some time ago and a bilateral blur is needed to reduce its noise. Compared with a traditional Gaussian blur, the important property of a bilateral blur is that it is edge-preserving, which is very useful when blurring images. A Gaussian blur considers only the spatial distance between pixels when sampling, not the similarity between their values, so the result is a blur of the entire image. The improvement of the bilateral blur is that sampling takes into account not only the spatial relationship between pixels but also the similarity between their values, so the main regions of the original image, and hence its edges, are preserved. Bilateral blur is therefore often used in the post-processing blur passes of game engines, for example to denoise SSAO.
2. Principles
In a filtering algorithm, the value at the target pixel is usually determined by the values of a small local neighborhood of pixels around it. A 2D Gaussian filter assigns a Gaussian weight to each pixel value within a certain range and takes the weighted average as the final result for the current pixel. Here the Gaussian weight factor is generated from the 2D spatial distance between two pixels in the image: the shape of the Gaussian curve shows that the closer a pixel is to the target point, the more it contributes to the final result, and vice versa. The formula is generally described as follows:

$$h(x) = k_d^{-1}(x) \iint f(\xi)\, c(\xi, x)\, d\xi$$
Here $c(\xi, x)$ is the Gaussian weight based on spatial distance, and $k_d(x) = \iint c(\xi, x)\, d\xi$ normalizes the result.
Gaussian filtering performs well as a low-pass filter, but it has a problem: it considers only the spatial relationship between pixel locations, so the filtered result loses edge information. "Edge" here mainly refers to the boundaries between the major color regions in the image (blue sky, black hair, and so on). Bilateral filtering solves this problem by adding a second weight term to the Gaussian blur. In bilateral filtering, edge preservation comes from the following range filter:

$$h(x) = k_r^{-1}(x) \iint f(\xi)\, s(f(\xi), f(x))\, d\xi$$
Here $s$ is a Gaussian weight based on the degree of similarity between pixel values, and $k_r(x) = \iint s(f(\xi), f(x))\, d\xi$ again normalizes the result. Combining the two gives a bilateral filter that considers both spatial distance and similarity:

$$h(x) = k^{-1}(x) \iint f(\xi)\, c(\xi, x)\, s(f(\xi), f(x))\, d\xi$$
The normalization term in the formula above is obtained by combining the two Gaussian weights:

$$k(x) = \iint c(\xi, x)\, s(f(\xi), f(x))\, d\xi$$

The weights $c$ and $s$ can be written out in detail as follows:

$$c(\xi, x) = e^{-\frac{1}{2}\left(\frac{d(\xi, x)}{\sigma_d}\right)^2}, \quad \text{with } d(\xi, x) = \|\xi - x\|$$

and

$$s(f(\xi), f(x)) = e^{-\frac{1}{2}\left(\frac{\delta(f(\xi), f(x))}{\sigma_r}\right)^2}, \quad \text{with } \delta(\phi, f) = |\phi - f|$$
The expressions above are integrals over all points in space. On a pixel-based image that is impossible, and also unnecessary, so the filter must be discretized before use. Moreover, the weighted sum need not run over every pixel of the image: pixels beyond a certain distance contribute almost nothing to the current target pixel and can be ignored. Discretized over a local window $\Omega$, the filter simplifies to:

$$h(x) = k^{-1}(x) \sum_{\xi \in \Omega} f(\xi)\, c(\xi, x)\, s(f(\xi), f(x))$$

where $k(x)$ is now the corresponding discrete sum of the weights.
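To make the discretized sum concrete, here is a minimal single-pixel bilateral filter over a grayscale buffer. This is a sketch only; all names and parameters (`bilateralAt`, `sigmaD`, `sigmaR`, `half`) are illustrative and not taken from the original code:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Bilateral filter result for one grayscale pixel (x, y) of a w*h image.
// sigmaD: spatial (closeness) sigma; sigmaR: range (similarity) sigma.
// The window is the (2*half+1)^2 neighborhood, clamped at the borders.
double bilateralAt(const std::vector<double>& img, int w, int h,
                   int x, int y, int half, double sigmaD, double sigmaR)
{
    double center = img[y * w + x];
    double sum = 0.0, norm = 0.0;
    for (int j = -half; j <= half; ++j)
        for (int i = -half; i <= half; ++i)
        {
            int sx = std::min(w - 1, std::max(0, x + i));
            int sy = std::min(h - 1, std::max(0, y + j));
            double v = img[sy * w + sx];
            // Closeness weight c from the squared spatial distance
            double c = std::exp(-0.5 * (i * i + j * j) / (sigmaD * sigmaD));
            // Similarity weight s from the gray-value difference
            double s = std::exp(-0.5 * (v - center) * (v - center) / (sigmaR * sigmaR));
            sum  += v * c * s;
            norm += c * s;
        }
    return sum / norm; // division by norm is the k(x) normalization
}
```

On a perfectly flat region every sample has $s = 1$, so the filter reduces to a normalized Gaussian average and returns the region's value unchanged.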
The theoretical formulas above form the basis of bilateral filtering. To see intuitively how Gaussian and bilateral filtering differ, consider the following illustration. The source image is a noisy two-region image (generated automatically by a program), and the center of the blue box marks the location of the target pixel. The Gaussian weights and the bilateral weights corresponding to that pixel are shown as 3D visualizations in the two figures that follow:
On the left is the original noisy image; the center shows the Gaussian sampling weights, and the right the bilateral sampling weights. With the similarity term added, the bilateral filter suppresses the pixels on the left of the source image that differ too much from the current pixel, so the edge is well preserved. To observe the difference between the two more clearly, MATLAB was used to draw 3D height maps of the two results, shown below:
From left to right the three figures show: bilateral filtering, the original image, and Gaussian filtering. The height maps make the difference between bilateral and Gaussian clear: the former preserves the sharp step at the edge, while the Gaussian filter turns the edge into a gradual linear ramp, which appears in the image as a loss of boundaries (image examples follow later).
3. Code Implementation
With the theory above, implementing a bilateral filter is fairly simple; in fact it differs little from an ordinary Gaussian blur. There are three main steps: generating the weights based on spatial distance, generating the weights based on similarity, and computing the final filtered color.
3.1 Spatial Weight
This is the same Gaussian weight used in Gaussian blur, computed from the distance between two pixels using the following formula:

$$c(\xi, x) = e^{-\frac{1}{2}\left(\frac{d(\xi, x)}{\sigma_d}\right)^2}$$
Here $d(\xi, x)$ is the distance between the two pixels. For example, the distance between the current pixel and the pixel immediately to its right is the Euclidean distance between the 2D vectors {0, 0} and {0, 1}. Once the Gaussian weights of a window have been computed and normalized, a Gaussian blur can be performed directly.
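A sketch of precomputing the spatial (closeness) weight table for a square window, assuming a half-width `half` and a Gaussian sigma `sigmaD`; the function and parameter names are hypothetical, loosely mirroring the `pGCWeight` table in the code later in this article:

```cpp
#include <cmath>
#include <vector>

// Precompute closeness weights c = exp(-0.5 * (d / sigmaD)^2) for a
// (2*half+1) x (2*half+1) window, where d is the Euclidean pixel distance
// from the window center. Laid out row-major, matching index
// (i + half) * (2*half + 1) + (j + half).
std::vector<double> makeClosenessWeights(int half, double sigmaD)
{
    int side = 2 * half + 1;
    std::vector<double> w(side * side);
    for (int i = -half; i <= half; ++i)
        for (int j = -half; j <= half; ++j)
        {
            double d2 = double(i * i + j * j); // squared Euclidean distance
            w[(i + half) * side + (j + half)] = std::exp(-0.5 * d2 / (sigmaD * sigmaD));
        }
    return w;
}
```

The center entry is always 1, and the weights fall off symmetrically with distance, so the table only needs to be built once per sigma.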
3.2 Similarity Weight
This is similar to the distance-based Gaussian weight, except that it is based not on the spatial distance between two pixels but on the degree of similarity between them (i.e., the distance between the two pixel values):

$$s(f(\xi), f(x)) = e^{-\frac{1}{2}\left(\frac{\delta(f(\xi), f(x))}{\sigma_r}\right)^2}$$
Here $\delta$ is the distance between the two pixel values. For grayscale images the difference between the gray values can be used directly; for color images, the Euclidean distance between the RGB vectors.
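A minimal sketch of the similarity weight for grayscale values, assuming the Gaussian form given above; the function name `similarityWeight` and the `sigmaR` parameter are illustrative:

```cpp
#include <cmath>

// Similarity (range) weight between two gray values a and b:
// s = exp(-0.5 * (delta / sigmaR)^2), with delta = |a - b|.
// For RGB pixels, delta would instead be the Euclidean distance
// between the two color vectors.
double similarityWeight(double a, double b, double sigmaR)
{
    double delta = a - b;
    return std::exp(-0.5 * (delta * delta) / (sigmaR * sigmaR));
}
```

Identical values give weight 1, and the weight decays as the values diverge, which is exactly what suppresses samples from the far side of an edge.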
3.3 Color Filtering
With the necessary weight factors in place, the implementation of bilateral filtering is no different from that of an ordinary Gaussian filter. The main code is as follows:
uchar3 BBColor(int posX, int posY)
{
    int centerItemIndex = posY * picWidth4 + posX * 3, neighbourItemIndex;
    int weightIndex;
    double csAccumWeight = 0;
    double accumColor = 0;

    // Compute the weight at each sample point: closeness (pGCWeight) times
    // similarity (pGSWeight); "number" is the half-width of the sampling window.
    for (int i = -number; i <= number; ++i)
    {
        for (int j = -number; j <= number; ++j)
        {
            weightIndex = (i + number) * (number * 2 + 1) + (j + number);
            neighbourItemIndex = min(noiseImageHeight - 1, max(0, posY + j * radius)) * picWidth4 +
                                 min(noiseImageWidth - 1, max(0, posX + i * radius)) * 3;
            pGSWeight[weightIndex] = LookUpGSWeightTable(pSrcDataBuffer[neighbourItemIndex],
                                                         pSrcDataBuffer[centerItemIndex]);
            pCSWeight[weightIndex] = pGSWeight[weightIndex] * pGCWeight[weightIndex];
            csAccumWeight += pCSWeight[weightIndex];
        }
    }

    // Normalize the weight factors
    csAccumWeight = 1 / csAccumWeight;
    for (int i = -number; i <= number; ++i)
    {
        for (int j = -number; j <= number; ++j)
        {
            weightIndex = (i + number) * (number * 2 + 1) + (j + number);
            pCSWeight[weightIndex] *= csAccumWeight;
        }
    }

    // Compute the final (grayscale) color and return it
    for (int i = -number; i <= number; ++i)
    {
        for (int j = -number; j <= number; ++j)
        {
            weightIndex = (i + number) * (number * 2 + 1) + (j + number);
            neighbourItemIndex = min(noiseImageHeight - 1, max(0, posY + j * radius)) * picWidth4 +
                                 min(noiseImageWidth - 1, max(0, posX + i * radius)) * 3;
            accumColor += pSrcDataBuffer[neighbourItemIndex + 0] * pCSWeight[weightIndex];
        }
    }

    return uchar3(accumColor, accumColor, accumColor);
}
The similarity weight $s$ is computed from the color difference between two pixels. For grayscale images the range of that difference is known in advance: [-255, 255]. To improve efficiency, this part of the weight factor can therefore be precomputed into a lookup table and queried quickly at filter time. The result of filtering several noisy images with the above algorithm is as follows:
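The lookup-table idea can be sketched as follows. The table covers every possible gray difference in [-255, 255] and is indexed by `diff + 255`; the names here are assumptions, loosely mirroring the `LookUpGSWeightTable` call in the code above rather than reproducing the article's actual implementation:

```cpp
#include <cmath>
#include <vector>

// Precompute similarity weights for every possible gray-value difference
// d in [-255, 255]. Entry (d + 255) holds exp(-0.5 * (d / sigmaR)^2).
std::vector<double> buildSimilarityTable(double sigmaR)
{
    std::vector<double> table(511);
    for (int d = -255; d <= 255; ++d)
        table[d + 255] = std::exp(-0.5 * (double(d) * d) / (sigmaR * sigmaR));
    return table;
}

// Usage sketch for two 8-bit gray values a and b:
//   double w = table[int(a) - int(b) + 255];
```

Since the weight depends only on the difference, 511 entries replace an exponential evaluation per sample, which matters when the filter runs over every pixel.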
From left to right: bilateral filtering, the original image, and Gaussian filtering. The difference between the two algorithms is clearly visible. The most obvious difference is that the Gaussian algorithm blurs the whole image, while bilateral filtering preserves the regions of the original image much better; look at the mouth and eyes (especially in the first pretty picture! PS matters a lot ~~ ^O^).
4. Use in SSAO
In the implementation above, edges are determined by the difference between two pixel values, which is the natural choice for ordinary images. Of course, other definitions of an edge can be used as needed, for example based on the scene's depth or normals. When filtering SSAO, the depth value can be used to determine edges: set a threshold for depth, and during edge detection compare the depth difference between two points; if the difference exceeds the threshold, the two points are considered to belong to different regions, so there should be a boundary there. The results obtained with this method are as follows:
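A minimal sketch of the depth-threshold test just described; the function name, parameters, and the exact rejection policy (zeroing the weight rather than attenuating it) are assumptions for illustration:

```cpp
#include <cmath>

// Depth-aware weight for an SSAO blur sample: if the depth difference between
// the sample and the center pixel exceeds depthThreshold, the two points are
// treated as belonging to different regions (an edge) and the sample is
// rejected; otherwise the ordinary spatial Gaussian weight is kept.
double depthAwareWeight(double centerDepth, double sampleDepth,
                        double spatialWeight, double depthThreshold)
{
    return (std::fabs(centerDepth - sampleDepth) > depthThreshold)
               ? 0.0
               : spatialWeight;
}
```

The surviving weights are then normalized by their sum, exactly as in the color-based filter, so occlusion values never bleed across depth discontinuities.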
Gaussian filtering
Bilateral filtering
After the filtered SSAO image is obtained, it can be combined directly with the original shaded image to produce the final rendering, as shown:
SSAO disabled
SSAO enabled
Postscript: this is my first blog post of 2012. It feels good. Keep it up ~!~!

Bilateral Filtering for SSAO