Please refer to the original English tutorial: http://www.deeplearning.net/tutorial/logreg.html
Here we will use Theano to implement the most basic classifier, logistic regression, and learn how mathematical expressions are mapped onto Theano computation graphs.
Logistic regression is a probabilistic linear classifier with parameters W (a weight matrix) and b (a bias vector). It projects an input vector onto a set of hyperplanes, each corresponding to a class; the distance from the input to a hyperplane reflects the probability that the input belongs to the corresponding class.
The probability that an input vector x belongs to class i is:

P(Y = i | x, W, b) = softmax_i(Wx + b) = e^{W_i x + b_i} / Σ_j e^{W_j x + b_j}

The predicted category is the class with the greatest probability:

y_pred = argmax_i P(Y = i | x, W, b)
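As a concrete illustration, the two formulas above can be sketched in plain NumPy; the values of W, b, and x here are hypothetical examples, not part of the tutorial:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Hypothetical parameters: 3 input features, 2 classes.
W = np.array([[0.5, -0.5],
              [1.0,  0.0],
              [0.0,  1.0]])
b = np.array([0.1, -0.1])

x = np.array([1.0, 2.0, 0.5])      # one input vector
p = softmax(np.dot(x, W) + b)      # P(Y = i | x, W, b) for each class i
y_pred = int(np.argmax(p))         # predicted class: the most probable one
```

The probabilities in `p` always sum to 1, and `y_pred` picks the index of the largest one.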
The code implemented with Theano is as follows:
# initialize with 0 the weights W as a matrix of shape (n_in, n_out)
self.W = theano.shared(
    value=numpy.zeros((n_in, n_out), dtype=theano.config.floatX),
    name='W', borrow=True)
# initialize with 0 the biases b as a vector of length n_out
self.b = theano.shared(
    value=numpy.zeros((n_out,), dtype=theano.config.floatX),
    name='b', borrow=True)
# symbolic expression for the matrix of class-membership probabilities
self.p_y_given_x = T.nnet.softmax(T.dot(input, self.W) + self.b)
# symbolic expression for the predicted class (index of the highest probability)
self.y_pred = T.argmax(self.p_y_given_x, axis=-1)
Since the model's parameters must maintain a persistent state throughout training, we declare W and b as shared variables: this both allocates persistent storage for them and makes them symbolic Theano variables.
The model as defined so far does not do anything useful yet; next we will show how to learn the optimal parameters.
Defining a loss function
For multi-class classification, it is common to use the negative log-likelihood as the loss. This amounts to maximizing the likelihood of the dataset D under the parameters θ = {W, b}. Let us first define the log-likelihood L and the loss ℓ:

L(θ = {W, b}, D) = Σ_{i=0}^{|D|} log P(Y = y^(i) | x^(i), W, b)

ℓ(θ = {W, b}, D) = −L(θ = {W, b}, D)
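The negative log-likelihood can be sketched in NumPy as follows; the probability matrix and labels here are hypothetical examples (using the mean over examples rather than the sum, as the tutorial does):

```python
import numpy as np

def negative_log_likelihood(p_y_given_x, y):
    """Mean negative log-likelihood of the correct classes.

    p_y_given_x : (n_examples, n_classes) array of softmax probabilities
    y           : (n_examples,) array of integer class labels
    """
    n = p_y_given_x.shape[0]
    # For each row i, pick the probability assigned to the true label y[i].
    correct_probs = p_y_given_x[np.arange(n), y]
    return -np.mean(np.log(correct_probs))

# Hypothetical softmax outputs for 3 examples over 2 classes.
p = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.5, 0.5]])
y = np.array([0, 1, 0])
loss = negative_log_likelihood(p, y)  # small when the true class gets high probability
```

The loss shrinks toward 0 as the model assigns probability close to 1 to each correct class.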
Here we use stochastic gradient descent to find the minimum of this loss.
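Gradient descent repeatedly updates the parameters against the gradient of the loss. A minimal NumPy sketch on a simple one-parameter loss (the quadratic loss here is an illustrative stand-in, not the tutorial's):

```python
# Illustrative loss: L(theta) = (theta - 3)^2, minimized at theta = 3.
def grad(theta):
    # Analytic derivative of the illustrative loss.
    return 2.0 * (theta - 3.0)

theta = 0.0           # initial parameter value
learning_rate = 0.1
for _ in range(100):
    # Gradient-descent update rule: theta <- theta - lr * dL/dtheta
    theta = theta - learning_rate * grad(theta)
```

In the stochastic variant, the gradient is estimated on one example (or a mini-batch) at a time instead of the full dataset; the update rule is the same.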
Creating a logistic regression class
For the full code of the class, please refer to the source URL: http://www.deeplearning.net/tutorial/logreg.html
Using logistic regression to classify MNIST handwritten digits
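To give a feel for what the full MNIST script does, here is a minimal NumPy sketch that trains softmax regression with gradient descent on a tiny synthetic dataset. Everything here (the data, sizes, learning rate, and epoch count) is illustrative, not the tutorial's MNIST code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic dataset: 100 points in 2-D, labeled by which side of the
# line x0 = x1 they fall on (linearly separable, so easy to learn).
X = rng.normal(size=(100, 2))
y = (X[:, 0] > X[:, 1]).astype(int)

n_in, n_out = 2, 2
W = np.zeros((n_in, n_out))   # weights initialized to 0, as in the tutorial
b = np.zeros(n_out)           # biases initialized to 0

def softmax(z):
    # Row-wise softmax with the usual max-subtraction for stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

learning_rate = 0.1
for epoch in range(100):
    p = softmax(X @ W + b)                # (100, 2) class probabilities
    # Gradient of the mean NLL w.r.t. the logits is (p - one_hot(y)) / n.
    delta = p.copy()
    delta[np.arange(len(y)), y] -= 1.0
    delta /= len(y)
    W -= learning_rate * (X.T @ delta)    # gradient step (full batch for brevity)
    b -= learning_rate * delta.sum(axis=0)

accuracy = np.mean(np.argmax(softmax(X @ W + b), axis=1) == y)
```

The real tutorial script follows the same pattern, but loads MNIST (784 inputs, 10 classes), iterates over mini-batches, and lets Theano derive the gradients symbolically instead of writing them out by hand.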