Support Vector Machine for Nonlinear Regression -- Matlab source code

Source: Internet
Author: User
Tags: svm

Both SVMs and neural networks can be used for nonlinear regression fitting, but their principles differ. The SVM is based on the structural risk minimization principle, and its generalization ability is generally considered better than that of neural networks. A large number of simulations have shown that SVMs generalize better than neural networks and avoid an inherent defect of neural networks: unstable training results. This source code can be used in many application scenarios, such as linear regression, nonlinear regression, nonlinear function fitting, data modeling, prediction, and classification.
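For reference, the model that this code fits is the standard ε-insensitive support vector regression problem (this formulation is standard background, stated here to frame the code below):

$$\min_{w,\,b,\,\xi,\,\xi^*}\ \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{L}\left(\xi_i+\xi_i^*\right)
\quad\text{s.t.}\quad
y_i - w^\top\phi(x_i) - b \le \epsilon+\xi_i,\;\;
w^\top\phi(x_i) + b - y_i \le \epsilon+\xi_i^*,\;\;
\xi_i,\,\xi_i^*\ge 0,$$

where $\phi$ is the feature map induced by the chosen kernel, $\epsilon$ is the width of the insensitive tube, and $C$ is the penalty coefficient; the code solves the dual of this problem as a quadratic program.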

function [Alpha1, Alpha2, Alpha, Flag, B] = SVMNR(X, Y, Epsilon, C, TKF, Para1, Para2)
%
% SVMNR.m
% Support Vector Machine for Nonlinear Regression
% All Rights Reserved
%
% General program for support vector machine nonlinear regression
% Program function:
% Performs support vector machine nonlinear regression to obtain the support vector
% representation of the nonlinear function y = f(x1, x2, ..., xn). The quadprog function
% of the Optimization Toolbox is called to solve the quadratic program. The program
% normalizes the data at its entry point, so the computed regression coefficients apply
% to the normalized data; the simulation test must use the matching Regression function.
% Input parameter list
% X        original sample input, an n x L matrix; n is the number of variables, L the number of samples
% Y        original sample output, a 1 x L matrix; L is the number of samples
% Epsilon  parameter of the epsilon-insensitive loss function; the larger Epsilon is, the fewer the support vectors
% C        penalty coefficient; if C is too large or too small, the generalization ability deteriorates
% TKF      Type of Kernel Function: the kernel function type
%          TKF = 1  linear kernel (note: use the linear kernel for linear SVM regression)
%          TKF = 2  polynomial kernel
%          TKF = 3  radial basis function (Gaussian) kernel
%          TKF = 4  exponential kernel
%          TKF = 5  sigmoid kernel
%          TKF = any other value: custom kernel
% Para1    first kernel function parameter
% Para2    second kernel function parameter
% Note: for the definition of the kernel parameters, see Regression.m and SVMNR.m.
% Output parameter list
% Alpha1   the alpha coefficients
% Alpha2   the alpha* coefficients
% Alpha    support vector weight vector (alpha - alpha*)
% Flag     1 x L tag: 0 marks a non-support vector, 1 a boundary support vector, 2 a standard support vector
% B        constant term of the regression equation
%--------------------------------------------------------------------------

%
%-----------------------Data normalization--------------------------------
nntwarn off
X = premnmx(X);
Y = premnmx(Y);
%
%
%-----------------------Initialize the kernel function parameters---------
switch TKF
    case 1
        % Linear kernel: K = sum(x.*y)
        % No parameters to define
    case 2
        % Polynomial kernel: K = (sum(x.*y) + c)^p
        c = Para1; % c = 0.1;
        p = Para2; % p = 2;
    case 3
        % Radial basis (Gaussian) kernel: K = exp(-(norm(x-y))^2/(2*sigma^2))
        sigma = Para1; % sigma = 6;
    case 4
        % Exponential kernel: K = exp(-norm(x-y)/(2*sigma^2))
        sigma = Para1; % sigma = 3;
    case 5
        % Sigmoid kernel: K = 1/(1 + exp(-v*sum(x.*y) + c))
        v = Para1; % v = 0.5;
        c = Para2; % c = 0;
    otherwise
        % A custom kernel must be edited by the user inside this function.
        % Note that the same expression appears in several places and all of
        % them must be changed consistently (see also Regression.m)!
        % Temporarily defined as K = exp(-(sum((x-y).^2))/(2*sigma^2))
        sigma = Para1; % sigma = 8;
end
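% Note (a suggestion, not part of the original code): because the kernel expression is
% duplicated in the K-matrix loop, in the computation of B, and again in Regression.m,
% one way to make a custom kernel editable in a single place is a function handle, e.g.
%     kfun = @(x, y) exp(-(sum((x - y).^2))/(2*sigma^2));
% and then call kfun(x, y) wherever a kernel value is needed.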
%
%
%-----------------------Construct the K matrix----------------------------
L = size(X, 2);
K = zeros(L, L); % initialize the K matrix
for i = 1:L
    for j = 1:L
        x = X(:, i);
        y = X(:, j);
        switch TKF % build the K matrix with the kernel selected above
            case 1
                K(i, j) = sum(x.*y);
            case 2
                K(i, j) = (sum(x.*y) + c)^p;
            case 3
                K(i, j) = exp(-(norm(x - y))^2/(2*sigma^2));
            case 4
                K(i, j) = exp(-norm(x - y)/(2*sigma^2));
            case 5
                K(i, j) = 1/(1 + exp(-v*sum(x.*y) + c));
            otherwise
                K(i, j) = exp(-(sum((x - y).^2))/(2*sigma^2));
        end
    end
end
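% Performance note (a sketch, not part of the original code): for the Gaussian kernel
% (TKF = 3) the double loop can be replaced by a vectorized computation using the
% Neural Network Toolbox dist function, which returns pairwise Euclidean distances:
%     K = exp(-(dist(X', X).^2)/(2*sigma^2));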
%
%
%------Construct the quadratic programming parameters H, Ft, Aeq, Beq, lb, ub------
% In SVM nonlinear regression, the regression function coefficients are determined
% by solving a quadratic programming model.
H = [K, -K; -K, K];                                    % quadratic term of the dual
Ft = [Epsilon*ones(1, L) - Y, Epsilon*ones(1, L) + Y]; % linear term
Aeq = [ones(1, L), -ones(1, L)];                       % equality constraint
Beq = 0;
lb = zeros(2*L, 1);                                    % lower bounds
ub = C*ones(2*L, 1);                                   % upper bounds
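% For reference, this is the standard epsilon-SVR dual: with Gamma = [alpha; alpha*],
% minimize 0.5*Gamma'*H*Gamma + Ft*Gamma subject to Aeq*Gamma = Beq
% (i.e. sum(alpha) - sum(alpha*) = 0) and lb <= Gamma <= ub
% (i.e. 0 <= alpha(i), alpha*(i) <= C for every sample i).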
%
%
%--------Call the Optimization Toolbox quadprog function to solve the QP----------
OPT = optimset;
OPT.LargeScale = 'off';
OPT.Display = 'off';
Gamma = quadprog(H, Ft, [], [], Aeq, Beq, lb, ub, [], OPT);
%
%
%------------------------Sort out the output regression equation coefficients-----
Alpha1 = (Gamma(1:L, 1))';
Alpha2 = (Gamma((L+1):end, 1))';
Alpha = Alpha1 - Alpha2;
Flag = 2*ones(1, L);
%
%
%---------------------------Classify the support vectors--------------------------
Err = 0.000000000001;
for i = 1:L
    AA = Alpha1(i);
    BB = Alpha2(i);
    if (abs(AA - 0) <= Err) && (abs(BB - 0) <= Err)
        Flag(i) = 0; % non-support vector
    end
    if (AA > Err) && (AA < C - Err) && (abs(BB - 0) <= Err)
        Flag(i) = 2; % standard support vector
    end
    if (abs(AA - 0) <= Err) && (BB > Err) && (BB < C - Err)
        Flag(i) = 2; % standard support vector
    end
    if (abs(AA - C) <= Err) && (abs(BB - 0) <= Err)
        Flag(i) = 1; % boundary support vector
    end
    if (abs(AA - 0) <= Err) && (abs(BB - C) <= Err)
        Flag(i) = 1; % boundary support vector
    end
end
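% The labels follow the KKT conditions of the dual: samples with alpha = alpha* = 0
% lie strictly inside the epsilon-tube and do not enter the regression function;
% standard support vectors (0 < alpha < C or 0 < alpha* < C) lie exactly on the tube
% boundary and are the ones used below to compute B; boundary support vectors
% (alpha = C or alpha* = C) may lie outside the tube. For a quick summary one can
% count them, e.g.
%     nSV = sum(Flag > 0); nStd = sum(Flag == 2);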
%
%
%--------------------Compute the constant term B of the regression equation-------
B = 0;
Counter = 0;
for i = 1:L
    AA = Alpha1(i);
    BB = Alpha2(i);
    if (AA > Err) && (AA < C - Err) && (abs(BB - 0) <= Err)
        % compute the weighted sum over the support vectors
        SUM = 0;
        for j = 1:L
            if Flag(j) > 0
                switch TKF
                    case 1
                        SUM = SUM + Alpha(j)*sum(X(:, j).*X(:, i));
                    case 2
                        SUM = SUM + Alpha(j)*(sum(X(:, j).*X(:, i)) + c)^p;
                    case 3
                        SUM = SUM + Alpha(j)*exp(-(norm(X(:, j) - X(:, i)))^2/(2*sigma^2));
                    case 4
                        SUM = SUM + Alpha(j)*exp(-norm(X(:, j) - X(:, i))/(2*sigma^2));
                    case 5
                        SUM = SUM + Alpha(j)*1/(1 + exp(-v*sum(X(:, j).*X(:, i)) + c));
                    otherwise
                        SUM = SUM + Alpha(j)*exp(-(sum((X(:, j) - X(:, i)).^2))/(2*sigma^2));
                end
            end
        end
        b = Y(i) - SUM - Epsilon;
        B = B + b;
        Counter = Counter + 1;
    end
    if (abs(AA - 0) <= Err) && (BB > Err) && (BB < C - Err)
        SUM = 0;
        for j = 1:L
            if Flag(j) > 0
                switch TKF
                    case 1
                        SUM = SUM + Alpha(j)*sum(X(:, j).*X(:, i));
                    case 2
                        SUM = SUM + Alpha(j)*(sum(X(:, j).*X(:, i)) + c)^p;
                    case 3
                        SUM = SUM + Alpha(j)*exp(-(norm(X(:, j) - X(:, i)))^2/(2*sigma^2));
                    case 4
                        SUM = SUM + Alpha(j)*exp(-norm(X(:, j) - X(:, i))/(2*sigma^2));
                    case 5
                        SUM = SUM + Alpha(j)*1/(1 + exp(-v*sum(X(:, j).*X(:, i)) + c));
                    otherwise
                        SUM = SUM + Alpha(j)*exp(-(sum((X(:, j) - X(:, i)).^2))/(2*sigma^2));
                end
            end
        end
        b = Y(i) - SUM + Epsilon;
        B = B + b;
        Counter = Counter + 1;
    end
end
if Counter == 0
    B = 0;
else
    B = B/Counter;
end

function y = Regression(Alpha, Flag, B, X, Y, TKF, Para1, Para2, x)
%--------------------------------------------------------------------------
% Regression.m
% Simulation test function used together with the SVMNR.m function
% Program function:
% This function is equivalent to the analytic form of the regression equation obtained
% from the support vectors. Given a test column vector x, it returns the corresponding
% output value y.
%--------------------------------------------------------------------------
% Input parameter list
% Alpha    support vector weight vector (alpha - alpha*)
% Flag     1 x L tag: 0 marks a non-support vector, 1 a boundary support vector, 2 a standard support vector
% B        constant term of the regression equation
% X        original sample input, an n x L matrix; n is the number of variables, L the number of samples
% Y        original sample output, a 1 x L matrix; L is the number of samples
% TKF      kernel function type (see SVMNR.m)
% Para1    first kernel function parameter
% Para2    second kernel function parameter
% Note: for the definition of the kernel parameters, see Regression.m and SVMNR.m.
% x        raw test data, an n x 1 column vector
% Output parameter list
% y        simulation test output value

%
%-----------------------Initialize the kernel function parameters---------
switch TKF
    case 1
        % Linear kernel: K = sum(x.*y)
        % No parameters to define
    case 2
        % Polynomial kernel: K = (sum(x.*y) + c)^p
        c = Para1; % c = 0.1;
        p = Para2; % p = 2;
    case 3
        % Radial basis (Gaussian) kernel: K = exp(-(norm(x-y))^2/(2*sigma^2))
        sigma = Para1; % sigma = 6;
    case 4
        % Exponential kernel: K = exp(-norm(x-y)/(2*sigma^2))
        sigma = Para1; % sigma = 3;
    case 5
        % Sigmoid kernel: K = 1/(1 + exp(-v*sum(x.*y) + c))
        v = Para1; % v = 0.5;
        c = Para2; % c = 0;
    otherwise
        % A custom kernel must be edited by the user inside this function.
        % Note that the same expression appears in several places and all of
        % them must be changed consistently (see also SVMNR.m)!
        % Temporarily defined as K = exp(-(sum((x-y).^2))/(2*sigma^2))
        sigma = Para1; % sigma = 8;
end
%
%
%----------------------Data normalization---------------------------------
[X, minX, maxX] = premnmx(X);
x = 2*(x - minX)./(maxX - minX) - 1; % normalize the test vector with the training min/max
[Y, minY, maxY] = premnmx(Y);
%
%
%---------------------Compute the simulation test output value--------------------
L = length(Alpha);
SUM = 0;
for i = 1:L
    if Flag(i) > 0
        switch TKF
            case 1
                SUM = SUM + Alpha(i)*sum(x.*X(:, i));
            case 2
                SUM = SUM + Alpha(i)*(sum(x.*X(:, i)) + c)^p;
            case 3
                SUM = SUM + Alpha(i)*exp(-(norm(x - X(:, i)))^2/(2*sigma^2));
            case 4
                SUM = SUM + Alpha(i)*exp(-norm(x - X(:, i))/(2*sigma^2));
            case 5
                SUM = SUM + Alpha(i)*1/(1 + exp(-v*sum(x.*X(:, i)) + c));
            otherwise
                SUM = SUM + Alpha(i)*exp(-(sum((x - X(:, i)).^2))/(2*sigma^2));
        end
    end
end
y = SUM + B;
%
%
%--------------------Denormalization---------------------------------------------
y = postmnmx(y, minY, maxY);
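A minimal usage sketch of the two functions together (the data, parameter values, and variable names below are illustrative assumptions, not part of the original code; it requires premnmx/postmnmx from the Neural Network Toolbox and quadprog from the Optimization Toolbox):

% demo: fit y = sin(x) with RBF-kernel SVM regression (illustrative example)
X = linspace(-3, 3, 40);            % sample inputs, 1 x 40 (n = 1 variable, L = 40)
Y = sin(X) + 0.05*randn(1, 40);     % noisy sample outputs, 1 x 40
Epsilon = 0.01; C = 10;             % tube width and penalty coefficient
TKF = 3; Para1 = 1; Para2 = 0;      % Gaussian kernel with sigma = 1; Para2 unused
[Alpha1, Alpha2, Alpha, Flag, B] = SVMNR(X, Y, Epsilon, C, TKF, Para1, Para2);
xt = linspace(-3, 3, 100);          % test grid
yt = zeros(1, 100);
for k = 1:100                       % Regression takes one n x 1 test vector at a time
    yt(k) = Regression(Alpha, Flag, B, X, Y, TKF, Para1, Para2, xt(k));
end
plot(X, Y, 'o', xt, yt, '-');       % samples vs. fitted regression curve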

