A simple MATLAB project: a basic three-layer neural network application


I. Design purpose: to classify the prime numbers within 1-100.


II. Design ideas:

1. Generate the numbers 1-100 and their corresponding 7-bit binary codes (see the short example after this list).

2. Label the prime numbers 1 and all remaining numbers 0.

3. Use the first 60 samples as training data and the remaining 40 for testing.

4. Use a three-layer neural network in which both the hidden layer and the output layer use the sigmoid activation function.
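
For instance, here is a minimal sketch of ideas 1 and 2 for a single number (dec2bin is the same MATLAB built-in the generator below uses; the choice of 13 is purely illustrative):

bin_str = dec2bin(13,7)    % '0001101', the 7-bit binary code of 13
row = [bin_str - '0', 1]   % digits as numbers, plus label 1 since 13 is prime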


III. Code implementation:

1. Dataset generation function

function f = dataset_generator
bits_num = 7;
prime_table = [2,3,5,7,11,13,17,19,23,29,31,37,41,43,47,53,59,61,67,71,73,79,83,89,97];
prime_number = length(prime_table);   % 25 primes below 100
prime_dataset = zeros(100,8);

% generate the prime-number dataset for 1-100
for count = 1:100
    bin_str = dec2bin(count,bits_num);
    for i = 1:bits_num
        prime_dataset(count,i) = str2num(bin_str(i));
    end
    for i = 1:prime_number
        if (count == prime_table(i))
            prime_dataset(count,bits_num+1) = 1;
        end
    end
    if (prime_dataset(count,bits_num+1) ~= 1)
        prime_dataset(count,bits_num+1) = 0;
    end
end

f = prime_dataset;
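
A quick usage check (assuming the function above is saved as dataset_generator.m on the MATLAB path):

data = dataset_generator;
data(13,:)   % [0 0 0 1 1 0 1 1]: the 7-bit code of 13 followed by label 1, since 13 is prime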
        


2. Optimal learning-rate selection function. With 8 hidden neurons the best alpha found was 1, and with 15 hidden neurons it was 0.1; the author leaves the question of why as an exercise. (One plausible reason: with more hidden units the full-batch weight updates sum over more terms, so a smaller step size is needed to keep training stable.)

function test_script1
% training set / testing set split
data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);
x_test = data(61:100,1:7);
y_test = data(61:100,8);

for pow_num = 1:5
    alpha = 10^(-3+pow_num);   % learning rate swept from 0.01 to 100
    % initialize the network
    syn0 = 2*rand(7,15)-1;
    syn1 = 2*rand(15,1)-1;
    % train the network
    for i = 1:60000
        l0 = x_train;
        l1 = sigmoid(l0*syn0);
        l2 = sigmoid(l1*syn1);
        l2_error = l2-y_train;
        if (i==1)
            overallerror(1) = mean(abs(l2_error));
        end
        if (mod(i,10000)==0)
            overallerror(i/10000+1) = mean(abs(l2_error));
        end
        l2_delta = l2_error.*sigmoid_derivation(l2);
        l1_error = l2_delta*syn1';
        l1_delta = l1_error.*sigmoid_derivation(l1);
        syn1 = syn1-alpha*(l1'*l2_delta);
        syn0 = syn0-alpha*(l0'*l1_delta);
    end
    alpha          % no semicolon: display the rate just tested
    overallerror   % and the recorded mean errors
end
% testing progress (left commented out in this selection script)
% testing_output = sigmoid(sigmoid(x_test*syn0)*syn1)
% testing_error = sum(abs(y_test-testing_output))

function s = sigmoid(x)
[m,n] = size(x);
for i = 1:m
    for j = 1:n
        s(i,j) = 1/(1+exp(-x(i,j)));
    end
end

function s = sigmoid_derivation(x)
s = x.*(1-x);
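
One step worth spelling out: sigmoid_derivation is applied to the activations l1 and l2, which have already been passed through the sigmoid, so its body is x.*(1-x) rather than a formula in the pre-activations. This follows from the standard identity

    \sigma(z) = \frac{1}{1+e^{-z}}, \qquad \sigma'(z) = \sigma(z)\,(1-\sigma(z)),

so with a = sigmoid(z) already computed, the derivative is simply a.*(1-a).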


3. Main program: data generation, training/testing data selection, network training, network testing, and comparison of results

function test_script
% training set / testing set split
data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);
x_test = data(61:100,1:7);
y_test = data(61:100,8);

% according to the results of "test_script1.m":
% "alpha = 1"   --------- "number of hidden neurons = 8"
% "alpha = 0.1" --------- "number of hidden neurons = 15"
alpha = 0.1;
% initialize the network
syn0 = 2*rand(7,15)-1;
syn1 = 2*rand(15,1)-1;
% train the network
for i = 1:60000
    l0 = x_train;
    l1 = sigmoid(l0*syn0);
    l2 = sigmoid(l1*syn1);
    l2_error = l2-y_train;
    if (i==1)
        overallerror(1) = mean(abs(l2_error));
    end
    if (mod(i,10000)==0)
        overallerror(i/10000+1) = mean(abs(l2_error));
    end
    l2_delta = l2_error.*sigmoid_derivation(l2);
    l1_error = l2_delta*syn1';
    l1_delta = l1_error.*sigmoid_derivation(l1);
    syn1 = syn1-alpha*(l1'*l2_delta);
    syn0 = syn0-alpha*(l0'*l1_delta);
end
overallerror   % no semicolon: display the recorded mean errors

% testing progress
testing_output = sigmoid(sigmoid(x_test*syn0)*syn1);
testing_output = round(testing_output);   % threshold the outputs at 0.5
testing_error = sum(abs(y_test-testing_output))
for cnt = 61:100
    testing_output(cnt-60,2) = cnt;   % put each tested number next to its prediction
end
testing_output

function s = sigmoid(x)
[m,n] = size(x);
for i = 1:m
    for j = 1:n
        s(i,j) = 1/(1+exp(-x(i,j)));
    end
end

function s = sigmoid_derivation(x)
s = x.*(1-x);
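
As an aside, the double loop in sigmoid is unnecessary: exp in MATLAB already operates element-wise, so the whole function reduces to one vectorized line. A minimal equivalent sketch (the name sigmoid_vec is mine, not from the original code):

function s = sigmoid_vec(x)
% element-wise sigmoid; produces the same output as the looped version above
s = 1./(1+exp(-x));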


IV. Results analysis and follow-up work: the test error rate is large because primality, unlike the simple odd/even discrimination of the earlier experiment, is a highly non-linear function of the binary digits. An open follow-up question is how the behavior of such a network relates to the underlying mathematics.
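
To make the contrast with the odd/even case concrete: parity is already present in the encoding, since the last binary digit of a number is 1 exactly when the number is odd, so a single input weight can solve that task. A minimal check against the dataset above (a sketch, not part of the original code):

data = dataset_generator;
isequal(data(:,7), mod((1:100)',2))   % returns 1: input column 7 is exactly the odd/even label

Primality has no comparably simple read-out from the seven bits, which is consistent with the large test error observed here.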

