A bit of background:
Gaussian Blur is a widely used technique in image processing, usually applied to reduce noise and the level of detail. The blurred result looks as if the image were being viewed through a translucent screen. Gaussian Blur is also used in the preprocessing stage of computer-vision algorithms to enhance image structures at different scales.
Typically, image-processing software provides a "blur" filter that blurs the picture.
There are many kinds of "blur" algorithms. One of them is called "Gaussian Blur"; it processes the image using a normal distribution (also known as a "Gaussian distribution").
This article introduces the Gaussian Blur algorithm; you will see that it is very easy to understand. In essence, it is a data smoothing technique, which is suitable for many situations, and image processing offers an intuitive example of its use.
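Before going into the principle, here is a minimal sketch of applying a Gaussian blur with the Pillow library; the file names and the radius value are placeholder assumptions for illustration, not part of the original article:

```python
from PIL import Image, ImageFilter

# Open an image, blur it with a Gaussian kernel, and save the result.
# "input.jpg", "blurred.jpg" and radius=2 are placeholder choices.
img = Image.open("input.jpg")
blurred = img.filter(ImageFilter.GaussianBlur(radius=2))
blurred.save("blurred.jpg")
```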
The principle of Gaussian Blur
The term "blur" is understood to mean that every pixel takes the average of the surrounding pixels.
In the figure above, the value of the middle point is 2, and the values of the surrounding points are all 1.
The "middle point" takes the average of the "surrounding points" and becomes 1. Numerically, this is a kind of "smoothing"; graphically, it is equivalent to producing a "blur" effect, in which the middle point loses its detail.