Naive Bayes and a Python Implementation
http://www.cnblogs.com/sumai
1. Model
In GDA, the feature vector x is required to be a continuous, real-valued vector. When x takes discrete values, we can consider the naive Bayes classification method instead.
Take spam classification as an example, using the simplest feature representation: take an English dictionary and list all of its words. Each email is then represented as a vector with one dimension per dictionary word; each component is a 0/1 value, where 1 means the word appears in the email and 0 means it does not.
For example, "a" and "buy" appear in an email without "aardvark", "Aardwolf" and "Zygmurgy",
Then it can be formally expressed as:
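As a small illustration of this representation, here is a minimal Python sketch; the helper name email_to_vector and the simple whitespace tokenization are assumptions for illustration, not something from the original post.

```python
# Build the 0/1 feature vector described above: component i is 1
# exactly when the i-th dictionary word appears in the email.
def email_to_vector(email_text, vocabulary):
    """Return a list of 0/1 indicators, one per dictionary word."""
    words_in_email = set(email_text.lower().split())
    return [1 if word in words_in_email else 0 for word in vocabulary]

# Toy usage: "a" and "buy" appear; "aardvark", "aardwolf", "zygmurgy" do not.
vocab = ["a", "aardvark", "aardwolf", "buy", "zygmurgy"]
print(email_to_vector("please buy a book", vocab))   # [1, 0, 0, 1, 0]
```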
If we modeled x directly, as in GDA, the feature vector x would follow a multinomial distribution over all of its possible values; with a 5,000-word dictionary, x can take 2^5000 possible values, so the model would need 2^5000 - 1 parameters, which is far too many. We therefore change the modeling assumption: naive Bayes does not model the feature vector x as a whole, but models each of its components x_i and assumes the components are independent of one another given the label y.
The naive Bayes model makes the following assumption about the components x_i of the feature vector x and the label y: the x_i are conditionally independent given y, that is, p(x_i | y, x_1, ..., x_{i-1}, x_{i+1}, ..., x_n) = p(x_i | y).
So we get: p(x_1, ..., x_n | y) = p(x_1 | y) p(x_2 | y) ... p(x_n | y) = ∏_{i=1}^{n} p(x_i | y). The model is parameterized by φ_{i|y=1} = p(x_i = 1 | y = 1), φ_{i|y=0} = p(x_i = 1 | y = 0), and φ_y = p(y = 1).
With this parameterization, only 2 × 5,000 + 1 parameters are needed (two Bernoulli parameters per word plus the class prior), far fewer than before.
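To see how the factored model is used to classify an email, here is a rough Python sketch. The parameter names phi1, phi0, and phi_y are assumptions for illustration (phi1[i] = p(x_i = 1 | y = 1), phi0[i] = p(x_i = 1 | y = 0), phi_y = p(y = 1)); the prediction compares p(x | y) p(y) for the two classes in log space, which follows from Bayes' rule up to the constant factor p(x).

```python
import numpy as np

# A rough sketch of classification under the independence assumption above.
def predict(x, phi1, phi0, phi_y):
    x, phi1, phi0 = np.asarray(x), np.asarray(phi1), np.asarray(phi0)
    # log p(x | y) = sum_i log p(x_i | y), thanks to conditional independence
    log_px_y1 = np.sum(x * np.log(phi1) + (1 - x) * np.log(1 - phi1))
    log_px_y0 = np.sum(x * np.log(phi0) + (1 - x) * np.log(1 - phi0))
    # Bayes' rule up to p(x): compare p(x | y) p(y) for y = 1 and y = 0
    return 1 if log_px_y1 + np.log(phi_y) > log_px_y0 + np.log(1 - phi_y) else 0

# Toy usage with the 5-word dictionary from the example above (made-up parameters)
print(predict([1, 0, 0, 1, 0],
              phi1=[0.6, 0.1, 0.1, 0.7, 0.1],
              phi0=[0.5, 0.2, 0.2, 0.1, 0.1],
              phi_y=0.3))   # prints 1 (classified as spam)
```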
2. Evaluation
Given a training set of m emails {(x^{(i)}, y^{(i)}); i = 1, ..., m}, the log-likelihood of the model is (writing j for the word index, since i now indexes training examples):

ℓ(φ_y, φ_{j|y=0}, φ_{j|y=1}) = Σ_{i=1}^{m} log p(x^{(i)}, y^{(i)}) = Σ_{i=1}^{m} [ log p(y^{(i)}) + Σ_{j=1}^{n} log p(x_j^{(i)} | y^{(i)}) ]
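A sketch of how this log-likelihood could be computed in Python, using the same assumed parameter names as above; X is an m × n matrix of 0/1 features and y a length-m vector of 0/1 labels.

```python
import numpy as np

# Evaluate the joint log-likelihood of the training set under the model.
def log_likelihood(X, y, phi1, phi0, phi_y):
    X, y = np.asarray(X), np.asarray(y)
    phi1, phi0 = np.asarray(phi1), np.asarray(phi0)
    # p(x_j = 1 | y^(i)) for every example i and word j
    phi = np.where(y[:, None] == 1, phi1, phi0)
    log_px = np.sum(X * np.log(phi) + (1 - X) * np.log(1 - phi), axis=1)
    log_py = np.where(y == 1, np.log(phi_y), np.log(1 - phi_y))
    return float(np.sum(log_px + log_py))
```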
3. Optimization
Taking the derivative of the log-likelihood with respect to each parameter and setting it to zero gives the maximum-likelihood estimates:

φ_{j|y=1} = Σ_{i=1}^{m} 1{x_j^{(i)} = 1 and y^{(i)} = 1} / Σ_{i=1}^{m} 1{y^{(i)} = 1}
φ_{j|y=0} = Σ_{i=1}^{m} 1{x_j^{(i)} = 1 and y^{(i)} = 0} / Σ_{i=1}^{m} 1{y^{(i)} = 0}
φ_y = Σ_{i=1}^{m} 1{y^{(i)} = 1} / m

where 1{·} is the indicator function. Intuitively, φ_{j|y=1} is simply the fraction of spam emails in which word j appears.
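These estimates are just per-class frequencies, so a few lines of NumPy suffice; a minimal sketch, where the function name fit_naive_bayes and the array layout (X an m × n 0/1 matrix, y a length-m 0/1 vector) are assumptions for illustration.

```python
import numpy as np

# Maximum-likelihood estimates corresponding to the formulas above.
def fit_naive_bayes(X, y):
    X, y = np.asarray(X), np.asarray(y)
    phi_y = np.mean(y == 1)           # p(y = 1): fraction of spam examples
    phi1 = X[y == 1].mean(axis=0)     # p(x_j = 1 | y = 1) for each word j
    phi0 = X[y == 0].mean(axis=0)     # p(x_j = 1 | y = 0) for each word j
    return phi1, phi0, phi_y
```

The returned parameters can then be plugged into the prediction and log-likelihood sketches above.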
Of course, the naive Bayes approach can be extended to cases where both x and y take more than two discrete values. For continuous features, we can discretize: split the range of the feature into segments and map each continuous value to a discrete one, in which case x_j | y follows a multinomial distribution rather than a Bernoulli distribution. Information gain can be used to decide how to choose the discretization.
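A possible sketch of this discretization step; the feature (living area) and the bin edges are purely illustrative assumptions, and the post suggests information gain as one way to choose such edges in practice.

```python
import numpy as np

# Bin a continuous feature into a small number of intervals so that
# x_j | y can be modeled as multinomial over the bin indices.
def discretize(values, bin_edges):
    """Map each continuous value to the index of the bin it falls into."""
    return np.digitize(values, bin_edges)

# Example: living area (square feet) split into 5 bins by 4 edges.
edges = [400, 800, 1200, 1600]
print(discretize([350, 950, 2000], edges))   # [0 2 4]
```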