Machine Learning Basics: Logistic Regression 2


Stochastic gradient ascent: update the regression coefficients using only one sample point at a time. Because the classifier can be updated incrementally as each new sample arrives, this is an online learning algorithm.

The batch gradient ascent method must traverse the entire data set every time the regression coefficients are updated. That is acceptable for a data set of around 100 samples, but with billions of samples and thousands of features the computational cost becomes prohibitive.
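For comparison, here is a minimal sketch of the batch version, assuming NumPy and a data matrix with one sample per row; the step size and cycle count are illustrative choices, not values from this page:

import numpy as np

def sigmoid(z):
    # Logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def gradAscent(dataMatrix, classLabels, alpha=0.001, maxCycles=500):
    # Batch gradient ascent: every single update touches all m samples
    data = np.asarray(dataMatrix, dtype=float)     # m x n
    labels = np.asarray(classLabels, dtype=float)  # length m
    m, n = data.shape
    weights = np.ones(n)
    for _ in range(maxCycles):
        h = sigmoid(data @ weights)                # predictions for all m samples
        error = labels - h                         # residual for every sample
        weights = weights + alpha * (data.T @ error)  # full-batch step
    return weights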

Pseudocode for the stochastic gradient ascent algorithm:

    Initialize all regression coefficients to 1
    For each sample in the data set:
        Calculate the gradient using that single sample
        Update the regression coefficients by alpha * gradient
    Return the regression coefficient values

def stocGradAscent0(dataMatrix, classLabels):
    # Uses the numpy import and sigmoid() defined above
    m, n = np.shape(dataMatrix)
    alpha = 0.01                       # fixed step size
    weights = np.ones(n)               # initialize all coefficients to 1
    for i in range(m):                 # one update per sample
        h = sigmoid(np.sum(dataMatrix[i] * weights))
        error = classLabels[i] - h     # scalar prediction error
        weights = weights + alpha * error * dataMatrix[i]
    return weights
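A quick way to try it out; the small data set below is hypothetical, chosen only for illustration (the first column is a constant bias term):

# Hypothetical toy data: bias column plus two features
X = np.array([[1.0, 0.5, 1.2],
              [1.0, 2.3, 0.4],
              [1.0, 1.1, 2.0],
              [1.0, 3.0, 0.1]])
y = np.array([0, 1, 0, 1])

weights = stocGradAscent0(X, y)
print(weights)   # one pass over the 4 samples, so only 4 updates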

Over many iterations the regression coefficients converge to stable values, but local fluctuations remain, because each update is driven by a single sample and the occasional misclassified point pulls the coefficients back and forth.
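One way to observe the fluctuation is to record the weights after every update (a sketch, assuming the X, y, sigmoid, and numpy import from above; the history-recording wrapper is not part of the original code):

def stocGradAscent0History(dataMatrix, classLabels, numPasses=200):
    # Same single-sample updates as stocGradAscent0, but keeps a
    # copy of the weights after each update so the wobble is visible
    m, n = np.shape(dataMatrix)
    alpha = 0.01
    weights = np.ones(n)
    history = []
    for _ in range(numPasses):
        for i in range(m):
            h = sigmoid(np.sum(dataMatrix[i] * weights))
            error = classLabels[i] - h
            weights = weights + alpha * error * dataMatrix[i]
            history.append(weights.copy())
    return weights, np.array(history)

weights, history = stocGradAscent0History(X, y)
# Plotting history[:, k] against the update index shows coefficient k
# settling toward a stable value while still fluctuating locally.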

The improved stochastic gradient ascent algorithm below addresses these problems.

def stocGradAscent1(dataMatrix, classLabels, numIter=150):
    m, n = np.shape(dataMatrix)
    weights = np.ones(n)                   # initialize all coefficients to 1
    for j in range(numIter):
        dataIndex = list(range(m))         # samples not yet used in this pass
        for i in range(m):
            # alpha decreases with each iteration but, because of the
            # constant term, never reaches 0
            alpha = 4 / (1.0 + j + i) + 0.0001
            randIndex = int(np.random.uniform(0, len(dataIndex)))  # pick a random remaining sample
            index = dataIndex[randIndex]
            h = sigmoid(np.sum(dataMatrix[index] * weights))
            error = classLabels[index] - h
            weights = weights + alpha * error * dataMatrix[index]
            del dataIndex[randIndex]       # do not reuse this sample in the current pass
    return weights
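Called the same way as before (again using the hypothetical X and y from above):

weights1 = stocGradAscent1(X, y, numIter=150)
print(weights1)   # 150 randomized passes; converges with much less wobble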

Improvements:

1. alpha is adjusted on every iteration, which dampens the fluctuations caused by noisy or high-frequency data. Although alpha keeps decreasing as the number of iterations grows, it never decreases to 0 (because of the constant 0.0001 term), so new data still has some influence even after many iterations; the short check after this list makes the schedule concrete.

2. The regression coefficients are updated using randomly selected samples, which reduces the periodic fluctuations that come from always visiting the samples in the same order.
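To see the first point concretely, print the step size for a few update indices (a sketch using the same alpha formula as stocGradAscent1):

for j in [0, 10, 100, 1000]:           # outer iteration counter
    for i in [0, 50]:                  # position within the current pass
        alpha = 4 / (1.0 + j + i) + 0.0001
        print(f"j={j:4d}  i={i:2d}  alpha={alpha:.6f}")
# alpha shrinks toward the 0.0001 floor but never below it, so samples
# seen late in training still nudge the coefficients slightly.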

