Implementing a BP neural network in 130 lines of code: principle and an application example

Source: Internet
Author: User

Optimization algorithms are an important part of machine learning, and the BP neural network is a foundation of deep learning. The principle of a BP network is very simple: it can almost be understood as a stack of logistic regressions. In earlier posts I implemented several optimization algorithms in R; here, based on the principles of neural networks, I implement a BP neural network in R using the particle swarm optimization (PSO) algorithm as the training tool, and illustrate the result of the network on the UCI credit-scoring data (click the link to download the data).
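The central trick of training a network with PSO is that each particle is one flat parameter vector, which the forward pass slices into the layer weight matrices before propagating the inputs through a tanh activation at every layer. A minimal sketch of that forward pass, written here in Python purely for illustration (the function name `forward` and the random toy data are my own; the 14-5-5-1 layer sizes match the network described in this post):

```python
import numpy as np

def forward(data, parameter, nclass=(14, 5, 5, 1)):
    """Forward pass of a 14-5-5-1 network whose weights are packed
    into a single flat parameter vector (one PSO particle)."""
    n_in, h1, h2, n_out = nclass
    # Slice the flat vector into the three weight matrices.
    w1 = parameter[:n_in * h1].reshape(n_in, h1)                    # 14x5
    w2 = parameter[n_in * h1:n_in * h1 + h1 * h2].reshape(h1, h2)   # 5x5
    w3 = parameter[n_in * h1 + h1 * h2:].reshape(h2, n_out)         # 5x1
    # tanh activation at every layer, as in the post.
    h = np.tanh(data @ w1)
    h = np.tanh(h @ w2)
    return np.tanh(h @ w3)

# A particle needs 14*5 + 5*5 + 5*1 = 100 parameters.
theta = np.random.uniform(0, 1, 100)
out = forward(np.random.randn(8, 14), theta)
print(out.shape)  # (8, 1)
```

Because the whole network lives in one 100-dimensional vector, "training" reduces to asking PSO to minimize the prediction error over that vector, with no gradients needed.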

##### Implementing the neural network in R, following its principles

The listing below is reconstructed from the (badly garbled) original; the fitness function `func`, the iteration driver `interfunc`, and the velocity-update line inside `psoafunc` did not survive the transcription and are filled in with standard choices, as noted in the comments. `data_train` and `data_test` are assumed to be the training/test splits of the UCI credit data (the data-loading code was also lost).

# Layer sizes: input layer, two hidden layers, output layer
nclass <- c(14, 5, 5, 1)

# Activation function: the hyperbolic tangent
# (the original text says "hyperbolic sine", but the formula is tanh)
sigmoid <- function(x) {
    return((exp(x) - exp(-x)) / (exp(x) + exp(-x)))
}

# Forward pass: unpack the parameter vector into the weight matrices
# and propagate the inputs through the network
neuralnet <- function(data, parameter) {
    data <- scale(data)
    r_ <- nrow(data[, -15])
    c_ <- ncol(data[, -15])   # 14 input features; column 15 is the label
    whide1 <- matrix(parameter[1:(c_ * nclass[2])],
                     nrow = c_, ncol = nclass[2])
    whide2 <- matrix(parameter[(c_ * nclass[2] + 1):
                               (c_ * nclass[2] + nclass[2] * nclass[3])],
                     nrow = nclass[2], ncol = nclass[3])
    wout <- parameter[(c_ * nclass[2] + nclass[2] * nclass[3] + 1):
                      (c_ * nclass[2] + nclass[2] * nclass[3] + nclass[3] * nclass[4])]
    # Compute the value of each node
    in_value <- as.matrix(data[, -15])
    hide1_value <- sigmoid(in_value %*% whide1)
    hide2_value <- sigmoid(hide1_value %*% whide2)
    out_value <- sigmoid(hide2_value %*% wout)
    return(out_value)
}

# Fitness function for PSO: squared error of the network on the training
# set (this definition was lost; reconstructed so that the code runs)
func <- function(parameter) {
    return(sum((neuralnet(data_train, parameter) - data_train[, 15])^2))
}

################ PSO implementation ################
# PSO parameters:
#   constriction factor k (plays a role similar to an inertia coefficient)
#   learning factors phi_1, phi_2 (phi <- phi_1 + phi_2)
#   particle positions w, bounded by [wmin, wmax]
#   particle velocities v, bounded by [vmin, vmax]
k <- 0.729
phi_1 <- 2.05
phi_2 <- 2.05   # these values follow earlier empirical work on PSO

# Initialization: random positions and velocities; the extra last row
# stores the best particle found so far (the global best)
inifunc <- function(dim, wmin, wmax, vmin, vmax, n) {
    a <- matrix(runif(n * dim, wmin, wmax), nrow = n, ncol = dim)
    b <- matrix(runif(n * dim, vmin, vmax), nrow = n, ncol = dim)
    c <- apply(a, 1, func)
    g <- cbind(a, b, c)[which.min(c), ]
    return(rbind(cbind(a, b, c), g))
}

# Historical global-best search over all particles (gbest)
gbestfunc <- function(x, dim, n) {
    c <- x[-(n + 1), 2 * dim + 1]
    best <- x[which.min(c), ]   # best particle of this iteration
    if (best[2 * dim + 1] < x[n + 1, 2 * dim + 1]) {
        return(rbind(x[-(n + 1), ], best))
    }
    return(x)                   # keep the previous global best
}

# PSO core function: move each particle, clamp positions and velocities
# to their bounds (out-of-range coordinates are resampled), steer the
# velocities toward the global best, and re-evaluate the fitness.
# The velocity-update line was lost; a standard constriction-factor
# update using only the global best is substituted here.
psoafunc <- function(x, dim, wmin, wmax, vmin, vmax, n) {
    a <- x[-(n + 1), 1:dim]
    b <- x[-(n + 1), (dim + 1):(2 * dim)]
    c <- x[-(n + 1), 2 * dim + 1]
    gbest <- x[n + 1, 1:dim]
    for (i in 1:n) {
        neww <- a[i, ] + b[i, ]
        for (j in 1:dim) {
            if (neww[j] > wmax || neww[j] < wmin) {
                neww[j] <- runif(1, wmin, wmax)
            }
        }
        b[i, ] <- k * (b[i, ] + phi_1 * runif(1) * (gbest - neww) +
                       phi_2 * runif(1) * (gbest - neww))
        for (j in 1:dim) {
            if (b[i, j] > vmax || b[i, j] < vmin) {
                b[i, j] <- runif(1, vmin, vmax)
            }
        }
        a[i, ] <- neww
        c[i] <- func(a[i, ])
    }
    return(rbind(cbind(a, b, c), x[n + 1, ]))
}

# Driver: initialize, then alternate PSO updates and global-best updates
# for itermax rounds; returns the best parameter vector found
# (this function's definition was lost; reconstructed from its call below)
interfunc <- function(itermax, dim, wmin, wmax, vmin, vmax, n) {
    x <- inifunc(dim, wmin, wmax, vmin, vmax, n)
    for (iter in 1:itermax) {
        x <- psoafunc(x, dim, wmin, wmax, vmin, vmax, n)
        x <- gbestfunc(x, dim, n)
    }
    return(x[n + 1, 1:dim])
}

# Threshold the network output at 0 to obtain class labels 0/1
option01 <- function(x) {
    for (i in 1:nrow(x)) {
        for (j in 1:ncol(x)) {
            if (x[i, j] > 0) x[i, j] <- 1 else x[i, j] <- 0
        }
    }
    return(x)
}

# Train: 100-dimensional particles (14*5 + 5*5 + 5*1 = 100 weights),
# 200 iterations, 100 particles; then tabulate the predictions against
# the true labels of the test set (a confusion matrix)
a <- interfunc(200, 100, 0, 1, -0.2, 0.2, 100)
table(option01(neuralnet(data_test, a)), data_test[, 15])
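The core PSO step above (move each particle by its velocity, resample any coordinate that leaves its bounds, pull velocities toward the best solution found so far with the constriction factor k = 0.729) can be sketched compactly. The Python below is a vectorized illustration under the same simplification as this post, i.e. a "social-only" variant with no per-particle bests; the function name `pso_step` and the toy sphere-function usage are my own, not part of the original:

```python
import numpy as np

K, PHI1, PHI2 = 0.729, 2.05, 2.05  # constriction factor and learning factors

def pso_step(pos, vel, gbest, fitness, wmin, wmax, vmin, vmax, rng):
    """One PSO iteration: move, clamp by resampling, steer toward the
    global best, re-evaluate, and keep the best-so-far solution."""
    new_pos = pos + vel
    # resample coordinates that left [wmin, wmax], as in the post
    bad = (new_pos < wmin) | (new_pos > wmax)
    new_pos[bad] = rng.uniform(wmin, wmax, int(bad.sum()))
    # constriction-factor velocity update (global-best term only)
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    new_vel = K * (vel + PHI1 * r1 * (gbest - new_pos)
                       + PHI2 * r2 * (gbest - new_pos))
    bad_v = (new_vel < vmin) | (new_vel > vmax)
    new_vel[bad_v] = rng.uniform(vmin, vmax, int(bad_v.sum()))
    # re-evaluate fitness and update the historical best
    scores = np.apply_along_axis(fitness, 1, new_pos)
    i = int(np.argmin(scores))
    if scores[i] < fitness(gbest):
        gbest = new_pos[i].copy()
    return new_pos, new_vel, gbest, scores

# toy usage: minimize the sphere function in 5 dimensions
rng = np.random.default_rng(1)
f = lambda w: float(np.sum(w ** 2))
pos = rng.uniform(-1, 1, (30, 5))
vel = rng.uniform(-0.2, 0.2, (30, 5))
gbest = pos[np.argmin(np.apply_along_axis(f, 1, pos))].copy()
for _ in range(50):
    pos, vel, gbest, scores = pso_step(pos, vel, gbest, f, -1, 1, -0.2, 0.2, rng)
print(round(f(gbest), 3))
```

Swapping the sphere function for the network's squared training error is all that separates this toy from the credit-scoring experiment: PSO never sees the network's structure, only a fitness value per parameter vector.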

If there is any mistake, please correct me.
