C# BP Neural Network and Example (2): Improved Version


In the article <C# BP Neural Network Class and Example>, I reproduced a class shared by a fellow netizen. I am very grateful for that selfless contribution, which let me quickly implement a feature in my software.

While using it, however, I found some shortcomings in the original version shared by the netizen, and I have improved it accordingly.

1. The training results are not saved, so the network is retrained from scratch every time it is used, which wastes time.

Although the class provides methods for saving the arrays W, V, b1 and b2, the constructor has a defect, and the following data needed during prediction are not saved: in_rate, innum, hidenum and outnum.

The author has made the following improvements:

A new parameterless constructor is added, so that an object can be instantiated for predicting new samples without retraining;

The saveparas and readparas methods are provided to save and read these parameters;

The initial method is provided to create the dynamic arrays used in the prediction process.
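As a sketch of the saveparas/readparas idea, the parameter file can be as simple as one value per line. The class and file layout below are my own assumption for illustration, not the original author's code:

```csharp
using System;
using System.IO;
using System.Globalization;

public class NetParas
{
    public double in_rate;
    public int innum, hidenum, outnum;

    // Write the four scalars that the original constructor computed but never persisted.
    public void saveparas(string path)
    {
        File.WriteAllLines(path, new[]
        {
            in_rate.ToString(CultureInfo.InvariantCulture),
            innum.ToString(),
            hidenum.ToString(),
            outnum.ToString()
        });
    }

    // Read them back so a parameterless instance can rebuild its arrays for prediction.
    public void readparas(string path)
    {
        string[] lines = File.ReadAllLines(path);
        in_rate = double.Parse(lines[0], CultureInfo.InvariantCulture);
        innum = int.Parse(lines[1]);
        hidenum = int.Parse(lines[2]);
        outnum = int.Parse(lines[3]);
    }
}
```

Persisting these four values is what lets a parameterless instance size its dynamic arrays (via initial) before the saved weight matrices are loaded.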

2. The class uses pseudo-random numbers to initialize the W and V arrays, so W and V are different each time a new object is created. I have not studied this algorithm in depth and do not know why it was designed this way.

The author changed this line of code:

```csharp
r = new Random();
```

to:

```csharp
r = new Random(32); // fixed seed so the generated pseudo-random sequence is the same on every run
```
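The effect of the fixed seed is easy to verify: two Random instances created with the same seed produce identical sequences, so the initial W and V values are reproducible across runs:

```csharp
using System;

class SeedDemo
{
    static void Main()
    {
        var a = new Random(32);
        var b = new Random(32);
        // Two generators with the same seed yield identical sequences,
        // so the weight initialization is the same on every run.
        for (int i = 0; i < 5; i++)
            Console.WriteLine(a.NextDouble() == b.NextDouble()); // True each time
    }
}
```

Without a seed argument, Random is time-seeded, so each run would start from different initial weights and the weights saved by one run would not correspond to a later retraining.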

The modified class and example are as follows:

```csharp
using System;
using System.IO;
using System.Text;

namespace bpannet
{
    /// <summary>
    /// Summary description for bpnet.
    /// </summary>
    public class bpnet
    {
        public int innum;       // number of input nodes
        int hidenum;            // number of hidden-layer nodes
        public int outnum;      // number of output-layer nodes
        public int samplenum;   // total number of samples
        Random r;
        double[] x;             // input values of the input nodes
        double[] x1;            // output of the hidden-layer nodes
        double[] x2;            // output of the output-layer nodes
        double[] o1;            // input of the hidden layer
        double[] o2;            // input of the output layer
        public double[,] w;     // weight matrix w
        public double[,] v;     // weight matrix v
        public double[,] dw;    // correction to weight matrix w
        public double[,] dv;    // correction to weight matrix v
        public double rate;     // learning rate
        public double[] b1;     // hidden-layer threshold vector
        public double[] b2;     // output-layer threshold vector
        public double[] db1;    // correction to the hidden-layer thresholds
        public double[] db2;    // correction to the output-layer thresholds
        double[] pp;            // hidden-layer error
        double[] qq;            // output-layer error
        double[] yd;            // teacher (target) data of the output layer
        public double e;        // mean squared error
        double in_rate;         // normalization scale factor

        // Empirical formula for choosing the number of hidden-layer nodes.
        public int computehidenum(int m, int n)
        {
            double s = Math.Sqrt(0.43 * m * n + 0.12 * n + 2.54 * m + 0.77 * n + 0.35) + 0.51;
            int ss = Convert.ToInt32(s);
            return (s - (double)ss) > 0.5 ? ss + 1 : ss;
        }

        public bpnet(double[,] p, double[,] t)
        {
            r = new Random(32); // fixed seed so the pseudo-random sequence is the same on every run
            this.innum = p.GetLength(1);   // second dimension of the array = number of input nodes
            this.outnum = t.GetLength(1);  // number of output nodes
            this.hidenum = computehidenum(innum, outnum); // number of hidden nodes; I do not know how the formula was derived
            // this.hidenum = 18;
            this.samplenum = p.GetLength(0); // first dimension of the array = number of samples
            Console.WriteLine("Number of input nodes: " + innum);
            Console.WriteLine("Number of hidden-layer nodes: " + hidenum);
            Console.WriteLine("Number of output-layer nodes: " + outnum);
            Console.ReadLine();

            x = new double[innum];
            x1 = new double[hidenum];
            x2 = new double[outnum];
            o1 = new double[hidenum];
            o2 = new double[outnum];
            w = new double[innum, hidenum];
            v = new double[hidenum, outnum];
            dw = new double[innum, hidenum];
            dv = new double[hidenum, outnum];
            b1 = new double[hidenum];
            b2 = new double[outnum];
            db1 = new double[hidenum];
            db2 = new double[outnum];
            pp = new double[hidenum];
            qq = new double[outnum];
            yd = new double[outnum];

            // Initialize w (the source is truncated from this point on; v, b1 and b2
            // are initialized the same way with values from the seeded generator)
            for (int i = 0; i < innum; i++)
                for (int j = 0; j < hidenum; j++)
                    w[i, j] = r.NextDouble() * 2 - 1;
        }

        // The parameterless constructor and the initial, saveparas/readparas, savematrix,
        // readmatrixw/readmatrixb, train and sim methods are truncated in the source.
    }
}
```
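For reference, the empirical hidden-layer formula can be evaluated on its own. This is a sketch using the same formula as the class; for example, with five inputs and one output it selects five hidden nodes:

```csharp
using System;

class HidenumDemo
{
    // Same empirical formula the class uses to size the hidden layer.
    public static int computehidenum(int m, int n)
    {
        double s = Math.Sqrt(0.43 * m * n + 0.12 * n + 2.54 * m + 0.77 * n + 0.35) + 0.51;
        int ss = Convert.ToInt32(s); // rounds to the nearest integer
        return (s - (double)ss) > 0.5 ? ss + 1 : ss;
    }

    static void Main()
    {
        // 5 input nodes, 1 output node
        Console.WriteLine(computehidenum(5, 1)); // prints 5
    }
}
```

Note that Convert.ToInt32(double) already rounds to the nearest integer, so the extra ternary adjustment only matters at the rounding boundary; the original author kept both, and the sketch preserves that behavior.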

Example:

```csharp
// The main caller
using System;

namespace bpannet
{
    /// <summary>
    /// Summary description for class1.
    /// </summary>
    class class1
    {
        /// <summary>
        /// The main entry point of the application.
        /// </summary>
        [STAThread]
        static void Main(string[] args)
        {
            // Alternative classification sample (the matching t1 labels are garbled in the source):
            // double[,] p1 = new double[,] { { 0.05, 0.02 }, { 0.09, 0.11 }, { 0.12, 0.20 }, { 0.15, 0.22 },
            //     { 0.20, 0.25 }, { 0.75, 0.75 }, { 0.80, 0.83 }, { 0.82, 0.80 }, { 0.90, 0.89 }, { 0.95, 0.89 },
            //     { 0.09, 0.04 }, { 0.1, 0.1 }, { 0.14, 0.21 }, { 0.18, 0.24 }, { 0.22, 0.28 }, { 0.77, 0.78 },
            //     { 0.79, 0.81 }, { 0.84, 0.82 }, { 0.94, 0.93 }, { 0.98, 0.99 } };
            // double[,] t1 = new double[,] { ..., { 0, 1 }, { 0, 1 }, { 0, 1 } };

            // Time-series samples: each row holds five consecutive values and the
            // target is the next value in the series.
            double[,] p1 = new double[,]
            {
                { 0.1399, 0.1467, 0.1567, 0.1595, 0.1588 },
                { 0.1467, 0.1567, 0.1595, 0.1588, 0.1622 },
                { 0.1567, 0.1595, 0.1588, 0.1622, 0.1611 },
                { 0.1595, 0.1588, 0.1622, 0.1611, 0.1615 },
                { 0.1588, 0.1622, 0.1611, 0.1615, 0.1685 },
                { 0.1622, 0.1611, 0.1615, 0.1685, 0.1789 }
            };
            double[,] t1 = new double[,] { { 0.1622 }, { 0.1611 }, { 0.1615 }, { 0.1685 }, { 0.1789 }, { 0.1790 } };

            bpnet bp = new bpnet(p1, t1);
            int study = 0;
            do
            {
                study++;
                bp.train(p1, t1);
                // bp.rate = 0.95 - (0.95 - 0.3) * study / 50000;
                // Console.Write("Iteration " + study + ": ");
                // Console.WriteLine("mean squared error is " + bp.e);
            } while (bp.e > 0.001 && study < 50000);
            Console.Write("Iteration " + study + ": ");
            Console.WriteLine("mean squared error is " + bp.e);
            bp.savematrix(bp.w, "w.txt");
            bp.savematrix(bp.v, "v.txt");
            bp.savematrix(bp.b1, "b1.txt");
            bp.savematrix(bp.b2, "b2.txt");
            bp.saveparas("para.txt");
            pretect(); // start prediction on new samples
        }

        public static void pretect()
        {
            Console.WriteLine("Prediction starts...");
            bpnet bp = new bpnet();
            bp.readparas("para.txt");
            bp.initial();
            bp.readmatrixw(bp.w, "w.txt");
            bp.readmatrixw(bp.v, "v.txt");
            bp.readmatrixb(bp.b1, "b1.txt");
            bp.readmatrixb(bp.b2, "b2.txt");
            // Alternative classification samples for prediction (same pairs as the
            // commented-out p1 in Main) are omitted here.
            double[,] p2 = new double[,]
            {
                { 0.1399, 0.1467, 0.1567, 0.1595, 0.1588 },
                { 0.1622, 0.1611, 0.1615, 0.1685, 0.1789 }
            };
            int aa = bp.innum;
            int bb = bp.outnum;
            int cc = p2.GetLength(0);
            double[] p21 = new double[aa];
            double[] t2 = new double[bb];
            for (int n = 0; n < cc; n++)
            {
                for (int i = 0; i < aa; i++)
                    p21[i] = p2[n, i];
                t2 = bp.sim(p21);
                for (int i = 0; i < t2.Length; i++)
                    Console.WriteLine("Prediction " + n.ToString() + ": " + t2[i]);
            }
            Console.ReadLine();
        }
    }
}
```
