KNN (k-Nearest Neighbor) Algorithm

Introduction

The k-Nearest Neighbor algorithm, usually abbreviated KNN, is a fairly classic machine learning algorithm. The k refers to the k data samples closest to the sample in question.

The Difference Between KNN and K-means

K-means is a clustering algorithm: it determines which samples are similar to one another and groups them together, so it belongs to the unsupervised algorithms. KNN, by contrast, is used for classification. The labels of the samples in the dataset are already known; given a data point x to classify, we find the k samples nearest to x and let them vote. If category C holds the majority among those k neighbors, x is labeled C. KNN is therefore clearly a supervised algorithm.
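
To make the voting procedure concrete, here is a minimal KNN classifier sketch in Python. It is not from the original article; the function name knn_classify, the brute-force Euclidean distance search, and the plain majority vote are simply one straightforward way to implement the idea described above.

    from collections import Counter
    import math

    def knn_classify(x, samples, labels, k=3):
        """Return the majority label among the k training samples nearest to x."""
        # Brute force: Euclidean distance from x to every training sample.
        distances = [math.dist(x, s) for s in samples]
        # Indices of the k smallest distances.
        nearest = sorted(range(len(samples)), key=lambda i: distances[i])[:k]
        # Majority vote among the k neighbors.
        votes = Counter(labels[i] for i in nearest)
        return votes.most_common(1)[0][0]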

The Wikipedia entry on KNN includes a classic illustration of this idea:

[Figure from the Wikipedia KNN entry: a green circle (the point to classify) among red triangles and blue squares.]

As the figure shows, there are two classes of sample data: blue squares and red triangles. The green circle is the data point we want to classify.
If k = 3, the 3 nearest neighbors of the green point are 2 red triangles and 1 blue square; these 3 points vote, so the green point is classified as a red triangle. If k = 5, the 5 nearest neighbors are 2 red triangles and 3 blue squares; these 5 points vote, so the green point is classified as a blue square.
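
Continuing the sketch above, the k = 3 versus k = 5 behavior can be reproduced with a tiny made-up dataset; the coordinates below are hypothetical and chosen only so that the two cases give the outcomes described for the figure.

    # Hypothetical coordinates mimicking the figure: 2 red triangles close to
    # the green point and 3 blue squares slightly farther away.
    samples = [(0.5, 0.0), (0.0, 0.6),              # red triangles
               (0.7, 0.0), (0.0, 0.8), (0.9, 0.0)]  # blue squares
    labels = ["triangle", "triangle", "square", "square", "square"]

    print(knn_classify((0.0, 0.0), samples, labels, k=3))  # -> triangle
    print(knn_classify((0.0, 0.0), samples, labels, k=5))  # -> square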


Pros & Cons

In KNN, the selected neighbors are objects that have already been correctly classified: the method assigns a sample to a category based on the categories of its one or several nearest samples. Although KNN relies on a limit theorem in principle, the actual classification decision involves only a small number of neighboring samples. Because it depends mainly on this limited set of nearby samples rather than on discriminating class regions, KNN is better suited than other methods to sample sets whose class regions cross or overlap heavily.


The main disadvantage of the KNN algorithm appears when the samples are unbalanced: if one class has a very large sample size and another a very small one, the large class may dominate the k neighbors of a new sample simply because it has more points. This can be mitigated by weighting the votes by distance, giving neighbors at a small distance a large weight and neighbors at a large distance a small weight.
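
One common form of the weighting mentioned above is inverse-distance voting: each of the k neighbors casts a vote weighted by 1/distance, so closer neighbors count more. The sketch below assumes that scheme rather than reproducing the article's exact method, and reuses the brute-force neighbor search from the earlier example.

    from collections import defaultdict
    import math

    def weighted_knn_classify(x, samples, labels, k=3, eps=1e-9):
        """Classify x by inverse-distance-weighted voting among its k nearest samples."""
        distances = [math.dist(x, s) for s in samples]
        nearest = sorted(range(len(samples)), key=lambda i: distances[i])[:k]
        scores = defaultdict(float)
        for i in nearest:
            # Closer neighbors receive larger weights; eps avoids division by
            # zero when x coincides exactly with a training sample.
            scores[labels[i]] += 1.0 / (distances[i] + eps)
        return max(scores, key=scores.get)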


Summary

The KNN algorithm is sensitive to initial choices such as the value of k; in general, combining it with other, more advanced classification algorithms gives better results.



