Top 10 classic algorithms for data mining (8): KNN, K-Nearest Neighbor classification


Nearest Neighbor Algorithm

[Figure: the KNN decision-making process]

In the figure, which class should the green circle be assigned to: the red triangles or the blue squares? If K = 3, two of the three nearest neighbors are red triangles (a proportion of 2/3), so the green circle is assigned to the red triangle class. If K = 5, three of the five nearest neighbors are blue squares (a proportion of 3/5), so the green circle is assigned to the blue square class.
The K-Nearest Neighbor (KNN) classification algorithm is a theoretically mature method and one of the simplest machine learning algorithms. The idea of the method is: if most of the k samples most similar to a given sample in the feature space (that is, its nearest neighbors in the feature space) belong to a particular category, then the sample also belongs to that category. In the KNN algorithm, the selected neighbors are all objects that have already been correctly classified. The method determines the category of a sample to be classified based only on the class of the nearest one or few samples.
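To make the voting rule concrete, here is a minimal sketch of plain KNN majority voting in Python. The point coordinates are invented to mimic the figure's layout and are not from the original article.

```python
# Minimal KNN majority voting, reproducing the figure's toy example.
# The coordinates below are illustrative assumptions, not source data.
import math
from collections import Counter

def knn_classify(query, samples, k):
    """Classify `query` by majority vote among its k nearest samples.

    `samples` is a list of ((x, y), label) pairs.
    """
    # Sort labeled samples by Euclidean distance to the query point.
    by_distance = sorted(samples, key=lambda s: math.dist(query, s[0]))
    # Count the labels of the k closest samples and take the majority.
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Toy data mimicking the figure: red triangles and blue squares
# scattered around a green circle at the origin.
samples = [
    ((0.5, 1.0), "red triangle"),
    ((1.0, 0.5), "red triangle"),
    ((-1.0, -0.5), "blue square"),
    ((-1.5, -1.0), "blue square"),
    ((-2.0, -1.5), "blue square"),
]

print(knn_classify((0.0, 0.0), samples, k=3))  # red triangle (2 of 3 neighbors)
print(knn_classify((0.0, 0.0), samples, k=5))  # blue square (3 of 5 neighbors)
```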
Although the KNN method also depends on the limit theorem in principle, the classification decision involves only a very small number of neighboring samples. Because KNN determines the category mainly from a limited set of nearby samples rather than by discriminating between class domains, it is better suited than other methods to sample sets whose class domains overlap or intersect heavily.
KNN can be used for both classification and regression. For regression, a sample's attribute can be predicted by finding its k nearest neighbors and assigning the average of those neighbors' attribute values to the sample. A more useful variant gives neighbors at different distances different weights on the prediction, for example weights inversely proportional to the distance.
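The regression variant could be sketched as below, assuming the inverse-distance weighting just described; the data points are invented for illustration.

```python
# Distance-weighted KNN regression with weight = 1 / distance.
# The sample data below is an illustrative assumption.
def knn_regress(query, samples, k, eps=1e-9):
    """Predict a value for `query` as the weighted mean of the values
    of its k nearest neighbors, with weight 1/distance.

    `samples` is a list of (x, value) pairs with scalar x.
    """
    by_distance = sorted(samples, key=lambda s: abs(query - s[0]))
    neighbors = by_distance[:k]
    # Inverse-distance weights; eps avoids division by zero when the
    # query coincides exactly with a training point.
    weights = [1.0 / (abs(query - x) + eps) for x, _ in neighbors]
    return sum(w * v for w, (_, v) in zip(weights, neighbors)) / sum(weights)

samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (10.0, 20.0)]
# The two closest points (x=2 and x=3) dominate the weighted average.
print(knn_regress(2.5, samples, k=3))  # about 4.57
```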
The main disadvantage of this algorithm in classification is that when the samples are unbalanced, for example when one class has a very large sample size while the others are very small, the K neighbors of a new input sample may be dominated by samples from the large-capacity class. This can be improved by weighting, giving larger weights to neighbors that are closer to the sample. Another disadvantage is the large amount of computation: the distance from each sample to be classified to every known sample must be calculated before its K nearest neighbors can be found. A common solution is to edit the known samples in advance, removing those that contribute little to classification. The algorithm is therefore better suited to automatic classification of class domains with large sample sizes; class domains with small sample sizes are prone to misclassification under this algorithm.
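Both mitigations have off-the-shelf counterparts. As one hedged example, scikit-learn's KNeighborsClassifier (an assumed third-party dependency, not named in the original article) supports distance weighting and a KD-tree index that avoids scanning every known sample per query:

```python
# Sketch using scikit-learn (assumed available): weights="distance"
# gives closer neighbors more influence, mitigating class imbalance,
# and algorithm="kd_tree" builds a tree index so each query no longer
# computes distances to all known samples.
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier

# Deliberately imbalanced synthetic data: about 90% of samples in one class.
X, y = make_classification(
    n_samples=1000, n_features=4, weights=[0.9, 0.1], random_state=0
)

clf = KNeighborsClassifier(n_neighbors=5, weights="distance", algorithm="kd_tree")
clf.fit(X, y)
print(clf.predict(X[:5]), clf.score(X, y))
```

The advance editing of samples mentioned above (removing points that contribute little to classification) is a separate preprocessing step and is not shown in this sketch.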
