Objective
Handwritten character recognition is a classic introductory problem in machine learning, and the k-nearest neighbor algorithm (KNN) is an entry-level machine learning algorithm. This article introduces the principle of the KNN algorithm, an analysis of the handwritten character recognition problem, and the implementation and testing of handwritten character recognition with KNN.
The principle of the KNN algorithm
KNN is a classification algorithm: given a set of input data, it determines which class that data belongs to. KNN is a supervised learning algorithm, so it must be given a training set containing both input samples and their output labels. (Unsupervised learning, by contrast, does not require labeled training samples.)
The simplest classification method is then to compare the input data with the samples, find the k most similar samples, and assign the input data to whichever class the majority of those k samples belong to.
Graphically, this means finding the k data points in the sample space nearest to the input data; whichever class most of those points belong to is the class assigned to the input. (Of course, this is only the intuition behind the algorithm, not a rigorous argument: there is no guarantee that the input data actually shares the class of its k nearest neighbors, but as an introductory algorithm we do not consider this issue.)
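This principle can be illustrated with a minimal Python sketch; the 2D points and labels below are made-up data, not from the article:

```python
import math
from collections import Counter

# Hypothetical 2D sample space: points with known class labels
samples = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
           ((5.0, 5.0), "B"), ((4.8, 5.2), "B"), ((5.1, 4.9), "B")]

def knn_classify(x, samples, k):
    # Take the k samples nearest to x, then let their labels vote
    nearest = sorted(samples, key=lambda s: math.dist(x, s[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

print(knn_classify((1.1, 0.9), samples, 3))  # prints A
```

The point (1.1, 0.9) sits among the "A" samples, so two of its three nearest neighbors are labeled "A" and the vote assigns it class "A".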
Handwritten digit recognition analysis
- Image preprocessing: binarization, segmentation, and normalization to a uniform size. This step is called preprocessing because it is not part of the KNN algorithm itself.
Figure 1: Sample input (handwritten "4" and "5")
- Input data formatting: because Euclidean distance is used to find the k nearest neighbors, it is best to convert each input image into a vector, so that the distance between the input data and each sample can be computed directly.
- Finding the k nearest neighbors: the core step. Compute the Euclidean distance to every training sample, sort, and take the k closest training samples.
- Classification decision: count the labels among those k training samples; the label that occurs most often is the result.
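The formatting step above can be sketched as follows; the 4x4 binarized patch is a made-up example for brevity, and real preprocessed digits would be larger:

```python
# Hypothetical 4x4 binarized digit patch (1 = ink, 0 = background)
image = [
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 1, 0, 0],
    [0, 1, 1, 0],
]

# Flatten the matrix row by row into a single feature vector
vector = [pixel for row in image for pixel in row]

# Euclidean distance between two such vectors
def euclidean(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

print(len(vector))  # prints 16
```

Once every image is a vector of the same length, the distance between any input image and any training sample is a single call to `euclidean`.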
Algorithm implementation
- Image preprocessing: the images are processed with MATLAB; this is not part of the algorithm itself.
- Input data formatting: for each preprocessed, labeled image, convert the pixel matrix into a vector after reading it in.
- Finding the k nearest neighbors:
- Classification decision:
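The last two steps can be sketched in Python as follows; the function names and the toy vectors are illustrative (the article's own implementation is not shown, and its preprocessing used MATLAB):

```python
import math
from collections import Counter

def find_k_nearest(x, train_vectors, train_labels, k):
    """Labels of the k training vectors closest to x (Euclidean distance)."""
    ranked = sorted(zip(train_vectors, train_labels),
                    key=lambda pair: math.dist(x, pair[0]))
    return [label for _, label in ranked[:k]]

def classify(x, train_vectors, train_labels, k):
    """Majority vote among the k nearest neighbors' labels."""
    votes = Counter(find_k_nearest(x, train_vectors, train_labels, k))
    return votes.most_common(1)[0][0]

# Toy flattened "images": hypothetical 2-element vectors for brevity
train_vectors = [[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]]
train_labels = ["0", "0", "1", "1"]
print(classify([0.0, 0.5], train_vectors, train_labels, 3))  # prints 0
```

Sorting all distances costs O(n log n) per query over n training samples, which already hints at the efficiency problem discussed in the conclusion.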
Test
The test run of the program produced a total of 12 erroneous outputs, for an error rate of 1.27%.
Conclusion
KNN is a simple and effective algorithm, but it must store the entire training set; if the training set is large, it occupies a great deal of storage. The algorithm's time and space complexity are both unsatisfactory. Simple and effective algorithms often sacrifice efficiency in this way, and it takes extra effort from the programmer to turn them into efficient ones.