KNN
- The idea is simple
- Requires almost no mathematical background (nearly zero)
- Works well in practice
- Its workflow illustrates many details of how machine learning algorithms operate
- Gives a fairly complete picture of the process of applying machine learning
Essence of k-Nearest Neighbors: if two samples are sufficiently similar, they probably belong to the same category.
e.g. the green dot is a newly added point. Take its nearest K (here K = 3) points as a small group and let them vote; the category with the most votes wins (blue beats red, 3:0), so the green dot should also be classified as blue.
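The voting step above can be sketched in a few lines. This is a minimal illustration, not a library API; the `nearest_labels` list is made up to mirror the 3:0 blue-vs-red example.

```python
from collections import Counter

# Labels of the K = 3 nearest neighbors of the new (green) point.
# All three are blue in the example above (blue beats red 3:0).
nearest_labels = ["blue", "blue", "blue"]

# Majority vote: the most common label among the K neighbors wins.
predicted = Counter(nearest_labels).most_common(1)[0][0]
print(predicted)  # blue
```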
Calculating distance:
The most common choice is the Euclidean distance. The distance between two points a and b (two-dimensional, three-dimensional, or higher-dimensional):
- d(a, b) = √((a₁ − b₁)² + (a₂ − b₂)² + … + (aₙ − bₙ)²)
Small note for understanding: take (feature 1 of sample a − feature 1 of sample b)² + (feature 2 of sample a − feature 2 of sample b)² + …, then take the square root of the sum.
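The formula above translates directly into code. A minimal sketch (the function name `euclidean_distance` is my own; it is not from the original notes):

```python
import math

def euclidean_distance(a, b):
    """Euclidean distance between two points with the same number of features."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Classic 3-4-5 right triangle as a sanity check.
print(euclidean_distance((0, 0), (3, 4)))  # 5.0
```

The same function works unchanged in any number of dimensions, because `zip` pairs up however many features the two points have.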
It is fair to say that KNN is almost the only machine learning algorithm that requires no training process: a new input sample can be compared directly against the training data set.
- KNN can be thought of as an algorithm without a model
- To stay consistent with other algorithms, the training data set itself can be regarded as the model
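Putting the pieces together — distance plus voting, with no training step — the whole algorithm fits in one short function. This is a sketch under my own naming (`knn_predict`, `X_train`, `y_train` are illustrative, and the toy data are invented):

```python
import math
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    """Classify x_new by majority vote among its k nearest training points."""
    # No training phase: just measure the distance from x_new to every sample.
    distances = [
        (math.sqrt(sum((xi - ni) ** 2 for xi, ni in zip(x, x_new))), label)
        for x, label in zip(X_train, y_train)
    ]
    distances.sort(key=lambda pair: pair[0])
    # Vote among the k closest labels; the most common one wins.
    k_labels = [label for _, label in distances[:k]]
    return Counter(k_labels).most_common(1)[0][0]

# Toy data: blue points cluster near (1, 1), red points near (5, 5).
X_train = [(1, 1), (1, 2), (2, 1), (5, 5), (6, 5)]
y_train = ["blue", "blue", "blue", "red", "red"]
print(knn_predict(X_train, y_train, (1.5, 1.5)))  # blue
```

Note that "the model" here really is just `X_train` and `y_train` kept in memory, which is exactly the point made above.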
KNN — the k-Nearest Neighbors algorithm