Softmax Regression and Python code


Softmax regression is a generalization of logistic regression to multi-class classification problems in which any two classes are linearly separable.

Assume there are $k$ classes, with a parameter vector ${\theta}_j$ for each class $j$. For each sample $x_i$, the probability that it belongs to class $j$ is:

\[p(y_i = j \mid x_i; \theta) = \frac{e^{\theta_j^T x_i}}{\sum\limits_{l=1}^{k} e^{\theta_l^T x_i}}\]
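For intuition, here is a minimal sketch of this probability computation for a single sample with $k = 3$ classes (the scores $\theta_j^T x_i$ below are made-up numbers):

import numpy as np

# Hypothetical class scores theta_j^T x_i for one sample and k = 3 classes
scores = np.array([2.0, 1.0, 0.1])
probs = np.exp(scores) / np.exp(scores).sum()
print(probs)            # [0.659 0.242 0.099] -- the probabilities sum to 1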

Unlike the loss function of logistic regression, the loss function of Softmax introduces an indicator function $1\{\cdot\}$, which evaluates to 1 when the condition inside the braces holds and to 0 otherwise. The loss function is:

$J(\theta) = -\frac{1}{m}\left[\sum\limits_{i=1}^{m} \sum\limits_{j=1}^{k} 1\{y_i = j\} \log \frac{e^{\theta_j^T x_i}}{\sum\limits_{l=1}^{k} e^{\theta_l^T x_i}}\right]$
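Because of the indicator, only the true class's log-probability survives for each sample, so the loss is the average negative log-probability assigned to the true class. A minimal sketch (the helper name and sample values are made up):

import numpy as np

def softmax_loss(scores, y):
    # scores: (m, k) matrix of theta_j^T x_i; y: (m,) true class indices
    probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    m = scores.shape[0]
    return -np.log(probs[np.arange(m), y]).mean()

scores = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
y = np.array([0, 1])
print(softmax_loss(scores, y))   # ~0.32: mean of -log(0.659) and -log(0.802)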

The indicator function means that, for each sample, only the log-probability of its true class contributes to the loss. The loss can be minimized by gradient descent; the gradient is derived as follows:

\[\nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum\limits_{i=1}^{m} \nabla_{\theta_j} \left[\sum\limits_{l=1}^{k} 1\{y_i = l\} \log \frac{e^{\theta_l^T x_i}}{\sum\limits_{t=1}^{k} e^{\theta_t^T x_i}}\right]\]

Writing the logarithm of the quotient as a difference, $\log \frac{e^{\theta_l^T x_i}}{\sum_{t} e^{\theta_t^T x_i}} = \theta_l^T x_i - \log \sum\limits_{t=1}^{k} e^{\theta_t^T x_i}$, the first term contributes $1\{y_i = j\}\, x_i$ and the second contributes $\frac{e^{\theta_j^T x_i}}{\sum_{t} e^{\theta_t^T x_i}}\, x_i$ when differentiated with respect to $\theta_j$ (the indicators sum to 1 over $l$), so

\[\nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum\limits_{i=1}^{m} \left[1\{y_i = j\}\, x_i - \frac{e^{\theta_j^T x_i}}{\sum\limits_{l=1}^{k} e^{\theta_l^T x_i}}\, x_i\right]\]

\[= -\frac{1}{m}\sum\limits_{i=1}^{m} \left[\left(1\{y_i = j\} - p(y_i = j \mid x_i; \theta)\right) x_i\right]\]
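One way to sanity-check this closed-form gradient is to compare it against a finite-difference approximation of the loss; a minimal sketch with random data (the helper names are my own, not part of the code below):

import numpy as np

def loss(theta, X, y):
    # theta: (n, k); X: (m, n); y: (m,) class indices
    probs = np.exp(X @ theta)
    probs /= probs.sum(axis=1, keepdims=True)
    return -np.log(probs[np.arange(len(y)), y]).mean()

def grad(theta, X, y):
    # Analytic gradient: -(1/m) * X^T (indicator - p)
    probs = np.exp(X @ theta)
    probs /= probs.sum(axis=1, keepdims=True)
    ind = np.zeros_like(probs)
    ind[np.arange(len(y)), y] = 1
    return -(X.T @ (ind - probs)) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
y = rng.integers(0, 3, size=5)
theta = rng.normal(size=(4, 3))
i, j, eps = 2, 1, 1e-6
t_plus, t_minus = theta.copy(), theta.copy()
t_plus[i, j] += eps
t_minus[i, j] -= eps
numeric = (loss(t_plus, X, y) - loss(t_minus, X, y)) / (2 * eps)
print(np.isclose(numeric, grad(theta, X, y)[i, j]))   # True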

For each class, the gradient with respect to its ${\theta}_j$ is computed separately; the Python code is as follows:

# -*- coding: utf-8 -*-
"""
Created on Sun Jan 15:32:44 2018
@author: zhang
"""
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn import preprocessing


def load_data():
    digits = load_digits()
    data = digits.data
    label = digits.target
    return np.mat(data), label


def gradient_descent(train_x, train_y, k, maxCycle, alpha):
    # k is the number of classes
    numSamples, numFeatures = np.shape(train_x)
    weights = np.mat(np.ones((numFeatures, k)))
    for i in range(maxCycle):
        value = np.exp(train_x * weights)
        rowsum = value.sum(axis=1)           # horizontal summation
        rowsum = rowsum.repeat(k, axis=1)    # horizontal replication extension
        err = -value / rowsum                # negative probability of each class for each sample
        for j in range(numSamples):
            err[j, train_y[j]] += 1          # add the indicator: err = 1{y_i = j} - p(y_i = j)
        weights = weights + (alpha / numSamples) * (train_x.T * err)
    return weights


def test_model(test_x, test_y, weights):
    results = test_x * weights
    predict_y = results.argmax(axis=1)
    count = 0
    for i in range(np.shape(test_y)[0]):
        if predict_y[i, 0] == test_y[i]:
            count += 1
    return count / len(test_y), predict_y


if __name__ == "__main__":
    data, label = load_data()
    # data = preprocessing.minmax_scale(data, axis=0)  # the recognition rate dropped after this preprocessing
    train_x, test_x, train_y, test_y = train_test_split(data, label, test_size=0.25, random_state=33)
    k = len(np.unique(label))
    weights = gradient_descent(train_x, train_y, k, 800, 0.01)
    accuracy, predict_y = test_model(test_x, test_y, weights)
    print("accuracy:", accuracy)
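One caveat with the code above: np.exp(train_x * weights) can overflow once the scores grow large. A common remedy (not part of the original code) is to subtract each row's maximum score before exponentiating, which leaves the resulting probabilities unchanged; a sketch of how the probability step inside gradient_descent could be rewritten:

# Numerically stable variant of the probability step: softmax(z) = softmax(z - max(z))
scores = train_x * weights
scores = scores - scores.max(axis=1)        # row maximum broadcasts across the k columns
value = np.exp(scores)                      # every exponent is now <= 0, so no overflow
rowsum = value.sum(axis=1).repeat(k, axis=1)
err = -value / rowsum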
