Regression with the BP algorithm

Source: Internet
Author: User
Tags: function definition, min, rand

I have recently been reading books on pattern recognition and studying some common machine learning algorithms. Although the books explain the theory of each algorithm clearly, they rarely give a concrete function definition, so I decided to write my own MATLAB implementations based on the books' descriptions and other people's code.

The BP algorithm is usually applied to a three-layer neural network: an input layer, a hidden layer, and an output layer. The input-layer and output-layer data are known, but the connection weights W between the input and hidden layers and the weights V between the hidden and output layers are unknown. We first set initial values for W and V, which amounts to constructing the internal parameters of the network. We then feed the network a set of inputs; under the initial parameters the network produces a set of outputs, which of course will not match the actual outputs. The next step is to use the error between the network outputs and the actual outputs to adjust W and V in turn, until the output error of the model satisfies the requirement.
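One training step described above (forward pass, output and hidden deltas, momentum-based weight updates) can be sketched in NumPy; this is an illustrative translation of the logic, not the article's MATLAB code, and the shapes and parameter names are my own:

```python
import numpy as np

def sigmoid(z):
    # logistic activation used by both layers
    return 1.0 / (1.0 + np.exp(-z))

def bp_step(w, v, w0, v0, x, t, alpha=0.5, beta=0.05):
    """One backpropagation step with momentum for a one-hidden-layer net.

    x: inputs, shape (n_in, n_samples); t: targets, shape (n_out, n_samples).
    w (hidden weights) and v (output weights) are updated; w0, v0 carry the
    previous updates (momentum terms). Returns new weights and the loss.
    """
    oi = sigmoid(w @ x)                          # hidden-layer outputs
    ok = sigmoid(v @ oi)                         # output-layer outputs
    err = t - ok                                 # output error
    y_delta = err * ok * (1 - ok)                # output-layer delta
    hn_delta = (v.T @ y_delta) * oi * (1 - oi)   # hidden-layer delta
    v0 = alpha * v0 + beta * (y_delta @ oi.T)    # momentum update, output layer
    w0 = alpha * w0 + beta * (hn_delta @ x.T)    # momentum update, hidden layer
    return w + w0, v + v0, w0, v0, 0.5 * float(np.sum(err * err))
```

Here alpha weights the previous update (momentum) and beta is the learning rate, matching the roles these parameters play in the MATLAB function.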

1. Function definition of the BP algorithm

function [Testingtime, Testingaccuracy, prey] = BP(train, test, in, hn, times, limit, alpha, beta)
% input and output must be row vectors
% hn is the number of hidden nodes
% times and limit give the conditions for leaving the training loop
% alpha is typically in 0.9~1 and beta in 0.1~3, but you can adjust them to any value as needed

input = train(1:in, :);  output = train((in+1):end, :);
[m, n] = size(input);  [N, n] = size(output);
% normalize the input elements
minx = min(min(input));  maxx = max(max(input));
input = (input - minx) / (maxx - minx);
% normalize the output elements
miny = min(min(output));  maxy = max(max(output));
output = (output - miny) / (maxy - miny);

w = rand(hn, m) * 2 - 1;  v = rand(N, hn) * 2 - 1;
w0 = zeros(size(w));  v0 = zeros(size(v));
epoch = 0;
while (epoch < times)
    epoch = epoch + 1;
    % compute the output of each neuron in the hidden layer
    neti = w * input;
    oi = sigmf(neti, [1, 0]);
    % compute the output of each neuron in the output layer
    netk = v * oi;
    ok = sigmf(netk, [1, 0]);
    % compute the global error function
    errtp = output - ok;
    accuracy = 0.5 * errtp * errtp';
    y_delta = errtp .* ok .* (1 - ok);
    % compute the hidden-layer error function
    errtp = v' * y_delta;
    hn_delta = errtp .* oi .* (1 - oi);

    % adjust the output-layer weights (with momentum)
    v0 = alpha * v0 + beta * y_delta * oi';
    v = v + v0;
    % adjust the hidden-layer weights (with momentum)
    w0 = alpha * w0 + beta * hn_delta * input';
    w = w + w0;
    % check whether the error is small enough
    if accuracy < limit
        break
    end
end


% second phase of the BP network: compute the output from the trained w, v and the given input
t0 = cputime;
% normalize the input elements
tinput = test(1:in, :);
tinput = (tinput - minx) / (maxx - minx);
% normalize the output elements
output = test((in+1):end, :);
output = (output - miny) / (maxy - miny);
% compute the output of each neuron in the hidden layer
neti = w * tinput;
oi = sigmf(neti, [1, 0]);
% compute the output of each neuron in the output layer
netk = v * oi;
ok = sigmf(netk, [1, 0]);
% compute the testing error (root of the summed squared errors)
errtp = output - ok;
Testingaccuracy = sqrt(errtp * errtp');
% de-normalize to obtain the prediction results
prey = ok * (maxy - miny) + miny;
t1 = cputime;
Testingtime = t1 - t0;
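A detail worth noting in the testing phase above: the test data is scaled with the minimum and maximum computed from the *training* set (minx, maxx, miny, maxy), and the same statistics are used to map the network output back to the original range. A small NumPy sketch of this min-max round trip (the function names are illustrative, not from the article):

```python
import numpy as np

def minmax_fit(a):
    # statistics to reuse for both training and test data
    return float(np.min(a)), float(np.max(a))

def minmax_apply(a, lo, hi):
    # scale values into [0, 1] using the training-set statistics
    return (a - lo) / (hi - lo)

def minmax_invert(a, lo, hi):
    # map network outputs in [0, 1] back to the original range
    return a * (hi - lo) + lo
```

Reusing the training statistics is what makes the de-normalized prediction `prey` comparable with the raw test targets.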

2. Example run to validate the function

x = 0.1:0.1:5;
y = power(x, 2);
train = [x; y];

x = 2.1:0.1:4;
y = power(x, 2);
test = [x; y];

[Testingtime, Testingaccuracy, prey] = BP(train, test, 1, 8, 1000, 0.001, 0.1, 0.1);
figure
plot(x, y, 'b', x, prey, 'r');
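For readers without MATLAB, the same experiment (train on y = x² over 0.1..5, predict on 2.1..4) can be reproduced end to end in NumPy. This is an illustrative analogue of the MATLAB example, not the author's code; the hyperparameters are chosen for stability rather than copied from the call above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(x, y, hn=8, times=1000, limit=1e-3, alpha=0.5, beta=0.05, seed=0):
    """Train a one-hidden-layer sigmoid net by backprop with momentum."""
    rng = np.random.default_rng(seed)
    # normalize inputs and targets to [0, 1] with training statistics
    xmin, xmax = x.min(), x.max()
    ymin, ymax = y.min(), y.max()
    xs = (x - xmin) / (xmax - xmin)
    ys = (y - ymin) / (ymax - ymin)
    w = rng.uniform(-1, 1, (hn, x.shape[0]))   # input -> hidden weights
    v = rng.uniform(-1, 1, (y.shape[0], hn))   # hidden -> output weights
    w0 = np.zeros_like(w)
    v0 = np.zeros_like(v)
    for _ in range(times):
        oi = sigmoid(w @ xs)                   # hidden layer
        ok = sigmoid(v @ oi)                   # output layer
        err = ys - ok
        if 0.5 * np.sum(err * err) < limit:    # early exit, like the limit check
            break
        yd = err * ok * (1 - ok)               # output-layer delta
        hd = (v.T @ yd) * oi * (1 - oi)        # hidden-layer delta
        v0 = alpha * v0 + beta * (yd @ oi.T)
        v = v + v0
        w0 = alpha * w0 + beta * (hd @ xs.T)
        w = w + w0
    return w, v, (xmin, xmax, ymin, ymax)

def predict_bp(w, v, stats, x):
    xmin, xmax, ymin, ymax = stats
    xs = (x - xmin) / (xmax - xmin)            # reuse training statistics
    ok = sigmoid(v @ sigmoid(w @ xs))
    return ok * (ymax - ymin) + ymin           # de-normalize the prediction

x = np.arange(0.1, 5.01, 0.1).reshape(1, -1)
y = x ** 2
w, v, stats = train_bp(x, y)
xt = np.arange(2.1, 4.01, 0.1).reshape(1, -1)
prey = predict_bp(w, v, stats, xt)
```

As in the MATLAB example, the predictions can then be plotted against the true curve to inspect the fit visually.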

