Stochastic gradient ascent method--update the regression coefficients with only one sample point at a time. Because the classifier can be updated incrementally as each new sample arrives, this is an online learning algorithm.
The (batch) gradient ascent method must traverse the entire data set every time the regression coefficients are updated. That is fine for a data set of around 100 samples, but with billions of samples and thousands of features the computational cost of the method is too high.
Stochastic gradient ascent pseudocode:
Initialize all regression coefficients to 1
For each sample in the data set:
    Calculate the gradient for that sample
    Update the regression coefficient values using alpha * gradient
Return the regression coefficient values
def stocGradAscent0(dataMatrix, classLabels):
    m, n = shape(dataMatrix)
    alpha = 0.01
    weights = ones(n)  # initialize to all ones
    for i in range(m):
        h = sigmoid(sum(dataMatrix[i] * weights))
        error = classLabels[i] - h
        weights = weights + alpha * error * dataMatrix[i]
    return weights
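For reference, the routine above can be made self-contained with NumPy. The `sigmoid` helper and the tiny two-feature data set below are assumptions added purely for illustration (the first column acts as a bias term):

```python
import numpy as np

def sigmoid(x):
    # logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def stocGradAscent0(dataMatrix, classLabels):
    # one pass over the data, updating the weights after every single sample
    m, n = dataMatrix.shape
    alpha = 0.01                 # fixed step size
    weights = np.ones(n)         # initialize to all ones
    for i in range(m):
        h = sigmoid(np.sum(dataMatrix[i] * weights))  # predicted probability
        error = classLabels[i] - h                    # scalar prediction error
        weights = weights + alpha * error * dataMatrix[i]
    return weights

# hypothetical toy data: a bias column plus one feature that separates the classes
data = np.array([[1.0, -2.0], [1.0, -1.5], [1.0, 1.5], [1.0, 2.0]])
labels = np.array([0, 0, 1, 1])
w = stocGradAscent0(data, labels)
print(w)
```

Even after this single pass, thresholding `sigmoid(data @ w)` at 0.5 already classifies the four toy points correctly, because each update nudges the weights in the direction that reduces that sample's error.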
Through a large number of iterations the regression coefficients reach stable values, but local fluctuations still occur along the way.
These problems in the stochastic gradient algorithm can be addressed with an improved stochastic gradient ascent algorithm.
def stocGradAscent1(dataMatrix, classLabels, numIter=150):
    m, n = shape(dataMatrix)
    weights = ones(n)  # initialize to all ones
    for j in range(numIter):
        dataIndex = list(range(m))
        for i in range(m):
            alpha = 4 / (1.0 + j + i) + 0.0001  # alpha decreases with iteration but never goes to 0 because of the constant
            randIndex = int(random.uniform(0, len(dataIndex)))  # randomly select a remaining sample
            index = dataIndex[randIndex]
            h = sigmoid(sum(dataMatrix[index] * weights))
            error = classLabels[index] - h
            weights = weights + alpha * error * dataMatrix[index]
            del dataIndex[randIndex]
    return weights
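A self-contained NumPy sketch of the improved routine follows; the `sigmoid` helper, the random seed, and the toy data set are assumptions added for illustration:

```python
import random
import numpy as np

def sigmoid(x):
    # logistic function: maps any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def stocGradAscent1(dataMatrix, classLabels, numIter=150):
    m, n = dataMatrix.shape
    weights = np.ones(n)            # initialize to all ones
    for j in range(numIter):
        dataIndex = list(range(m))  # samples not yet used in this pass
        for i in range(m):
            # step size shrinks over time but is floored at 0.0001
            alpha = 4 / (1.0 + j + i) + 0.0001
            # draw one of the remaining samples at random
            randIndex = int(random.uniform(0, len(dataIndex)))
            index = dataIndex[randIndex]
            h = sigmoid(np.sum(dataMatrix[index] * weights))
            error = classLabels[index] - h
            weights = weights + alpha * error * dataMatrix[index]
            del dataIndex[randIndex]  # each sample is used exactly once per pass
    return weights

random.seed(0)  # fixed seed so the random sampling is reproducible
# hypothetical toy data: a bias column plus one feature that separates the classes
data = np.array([[1.0, -2.0], [1.0, -1.5], [1.0, 1.5], [1.0, 2.0]])
labels = np.array([0, 0, 1, 1])
w = stocGradAscent1(data, labels)
print(w)
```

Note that `del dataIndex[randIndex]` turns the random draws into sampling without replacement, so every pass still visits each sample once, just in a shuffled order.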
Improvements:
1. alpha is adjusted on every iteration, which alleviates the data fluctuations or high-frequency oscillations in the coefficients. Although alpha keeps decreasing as the number of iterations grows, it never decreases to 0, which guarantees that new data still has some influence even after many iterations.
2. The regression coefficients are updated using randomly selected samples, which reduces periodic fluctuations.
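Point 1 can be checked directly by evaluating the step-size formula from the code at a few (j, i) pairs chosen here for illustration:

```python
# alpha = 4/(1.0 + j + i) + 0.0001: large early steps, bounded below by 0.0001
pairs = [(0, 0), (1, 5), (150, 100)]  # arbitrary sample (iteration, sample-index) pairs
alphas = [4 / (1.0 + j + i) + 0.0001 for j, i in pairs]
print(alphas)  # strictly decreasing, yet every value stays above the 0.0001 floor
```

The 0.0001 constant is what keeps the schedule from ever reaching 0, so late-arriving samples always move the weights at least a little.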
Machine Learning Basics-logistic Regression 2