Using the PyBrain library for neural network function fitting

Source: Internet
Author: User
Tags: network, function, sin

PyBrain is a well-known Python neural network library. I used it for an experiment, following the referenced blog (thanks to the original author), and give a concrete implementation here; the code can be copied and run directly.
The problem we want to solve is the following system-identification task.
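Reading the equation back from the data-generation code below, the system to be fitted is this difference equation (the first coefficient in the numerator is garbled in the source code; 16 is an assumption):

y(k+1) = (29/40) · sin( (16·u(k) + 8·y(k)) / (3 + 4·u(k)² + 4·y(k)²) ) + (2/10)·u(k) + (2/10)·y(k)

Here u(k) is a random input drawn uniformly from [-1, 1] and y(k) is the resulting output sequence.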

First, we define a function that generates the dataset for this problem.

def generate_data():
    """Generate the original data for u and y."""
    u = np.random.uniform(-1, 1, 200)
    y = []
    former_y_value = 0
    for i in np.arange(0, 200):
        y.append(former_y_value)
        next_y_value = (29 / 40) * np.sin(
            (16 * u[i] + 8 * former_y_value)
            / (3 + 4 * (u[i] ** 2) + 4 * (former_y_value ** 2))) \
            + (2 / 10) * u[i] + (2 / 10) * former_y_value
        former_y_value = next_y_value
    return u, y
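As a quick standalone sanity check, here is a pure-NumPy, seeded version of the same recursion (the 16 in the numerator is read from the garbled source and is an assumption):

```python
import numpy as np

def generate_data(n=200, seed=0):
    # same recursion as above; seeded so the check is reproducible
    rng = np.random.RandomState(seed)
    u = rng.uniform(-1, 1, n)
    y = []
    former = 0.0
    for i in range(n):
        y.append(former)
        former = ((29 / 40) * np.sin((16 * u[i] + 8 * former)
                                     / (3 + 4 * u[i] ** 2 + 4 * former ** 2))
                  + 0.2 * u[i] + 0.2 * former)
    return u, np.array(y)

u, y = generate_data()
print(len(u), len(y))               # 200 200
print(bool(np.all(np.abs(y) < 2)))  # True: the sequence stays bounded
```

Because |sin(·)| ≤ 1 and |u| ≤ 1, each step satisfies |y(k+1)| ≤ 29/40 + 1/5 + |y(k)|/5, so starting from 0 the sequence never exceeds 2 regardless of the random input.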

Plotted, the function looks like this:

Our example trains the network on most of the sequence and uses the remaining points for prediction (the code below splits the data 80/20). The basic steps for building a PyBrain neural network are: construct the network structure, build the dataset, train the network, visualize the results, and verify and analyze them.

Constructing the neural network

The process of building the network is straightforward: specify the layers and the number of nodes in each, as the code below shows.

import numpy as np
import matplotlib.pyplot as plt

from pybrain.structure import *
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

# create a neural network
fnn = FeedForwardNetwork()

# create three layers: input layer: 2 input units; hidden layer: 10 units; output layer: 1 output
inLayer = LinearLayer(2, name='inLayer')
hiddenLayer0 = SigmoidLayer(10, name='hiddenLayer0')
outLayer = LinearLayer(1, name='outLayer')

# add the three layers to the neural network
fnn.addInputModule(inLayer)
fnn.addModule(hiddenLayer0)
fnn.addOutputModule(outLayer)

# link the three layers
in_to_hidden0 = FullConnection(inLayer, hiddenLayer0)
hidden0_to_out = FullConnection(hiddenLayer0, outLayer)

# add the connections to the neural network
fnn.addConnection(in_to_hidden0)
fnn.addConnection(hidden0_to_out)

# make the neural network come into effect
fnn.sortModules()
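For intuition, the forward pass of this 2-10-1 network is just a matrix product, an elementwise sigmoid, and another matrix product. A minimal NumPy sketch (the weights here are random placeholders, not PyBrain's trained ones):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w_in, w_out):
    # x: (2,) input vector; w_in: (10, 2); w_out: (1, 10)
    hidden = sigmoid(w_in @ x)   # 10 sigmoid hidden units
    return w_out @ hidden        # 1 linear output unit

rng = np.random.RandomState(0)
w_in, w_out = rng.randn(10, 2), rng.randn(1, 10)
print(forward(np.array([0.5, -0.2]), w_in, w_out).shape)  # (1,)
```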
Building the dataset

We use 2 inputs and 1 output, with 80% of the samples for training and 20% for prediction.

# obtain the original data
u, y = generate_data()

# define the dataset as two inputs, one output
ds = SupervisedDataSet(2, 1)

# add data elements to the dataset
for i in np.arange(199):
    ds.addSample([u[i], y[i]], [y[i + 1]])

# you can get your input/output this way
X = ds['input']
Y = ds['target']

# split the dataset into a training set and a test set
dataTrain, dataTest = ds.splitWithProportion(0.8)
xTrain, yTrain = dataTrain['input'], dataTrain['target']
xTest, yTest = dataTest['input'], dataTest['target']
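Note that splitWithProportion(0.8) chooses the training subset at random; the proportions themselves can be sketched deterministically (a hypothetical index split, not PyBrain's shuffled one):

```python
import numpy as np

n = 199                 # samples added to the dataset above
n_train = int(n * 0.8)  # 80% for training
train_idx = np.arange(n)[:n_train]
test_idx = np.arange(n)[n_train:]
print(len(train_idx), len(test_idx))  # 159 40
```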
  
Training Neural Networks

We let it train for up to 1000 epochs.

# train the NN with the BP (backpropagation) algorithm
# verbose=True means print the total error
trainer = BackpropTrainer(fnn, dataTrain, verbose=True, learningrate=0.01)

# set the epoch limit to make the NN fit
trainer.trainUntilConvergence(maxEpochs=1000)
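Under the hood, BackpropTrainer performs gradient-descent weight updates from backpropagated errors. A minimal sketch of one training epoch for a 2-10-1 network in plain NumPy (biases omitted for brevity; an illustration of the idea, not PyBrain's implementation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_epoch(X, T, W1, W2, lr=0.01):
    """One epoch of plain backprop for a 2-10-1 net; returns the summed squared error."""
    total_err = 0.0
    for x, t in zip(X, T):
        h = sigmoid(W1 @ x)               # forward: hidden activations
        y_hat = W2 @ h                    # forward: linear output
        err = y_hat - t                   # output error
        total_err += 0.5 * float(err @ err)
        dW2 = np.outer(err, h)            # gradient for output weights
        dh = (W2.T @ err) * h * (1 - h)   # backprop through the sigmoid
        dW1 = np.outer(dh, x)             # gradient for input weights
        W2 -= lr * dW2                    # gradient-descent updates
        W1 -= lr * dW1
    return total_err

rng = np.random.RandomState(1)
X = rng.uniform(-1, 1, (50, 2))
T = np.sin(X[:, :1])                      # toy target for the demo
W1, W2 = 0.1 * rng.randn(10, 2), 0.1 * rng.randn(1, 10)
errs = [train_epoch(X, T, W1, W2) for _ in range(20)]
print(errs[-1] < errs[0])                 # the error shrinks over epochs
```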
Visualization of results

We use matplotlib to plot the predictions against the actual values.

predict_result = []
for i in np.arange(len(xTest)):
    predict_result.append(fnn.activate(xTest[i])[0])
print(predict_result)

plt.figure()
plt.plot(np.arange(0, len(xTest)), predict_result, 'ro--', label='predicted value')
plt.plot(np.arange(0, len(xTest)), yTest, 'ko-', label='true value')
plt.legend()
plt.xlabel("x")
plt.ylabel("y")

plt.show()

Running the prediction on the test set for this problem produces a figure like this.
Analysis

for mod in fnn.modules:
    print("Module:", mod.name)
    if mod.paramdim > 0:
        print("--parameters:", mod.params)
    for conn in fnn.connections[mod]:
        print("-connection to", conn.outmod.name)
        if conn.paramdim > 0:
            print("-parameters", conn.params)
    if hasattr(fnn, "recurrentConns"):
        print("Recurrent connections")
        for conn in fnn.recurrentConns:
            print("-", conn.inmod.name, "to", conn.outmod.name)
            if conn.paramdim > 0:
                print("-parameters", conn.params)

This prints out the detailed structure and parameters of the trained network; the result looks like this:

Module: hiddenLayer0
-connection to outLayer
-parameters [-0.48485978  1.94439991 -1.1686299  -1.01764515 -1.04221    -0.78088745
  0.27321985 -1.76426041  2.0747614   1.98425053]
Module: inLayer
-connection to hiddenLayer0
-parameters [ 1.48125364 -0.97942827  4.7258546   2.08059918 -1.96960441 -0.03098871
  0.52430318  1.64983933  0.43738152  1.95122015  0.81952423 -0.24019787
 -0.86026329  0.63505556  0.53870484  0.94078527  1.42263437  1.87720358
 -1.12582038  0.70344489]
Module: outLayer
The complete code

Finally, here is the complete code; note that you need to install PyBrain first.

import numpy as np
import matplotlib.pyplot as plt

from pybrain.structure import *
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer


def generate_data():
    """Generate the original data for u and y."""
    u = np.random.uniform(-1, 1, 200)
    y = []
    former_y_value = 0
    for i in np.arange(0, 200):
        y.append(former_y_value)
        next_y_value = (29 / 40) * np.sin(
            (16 * u[i] + 8 * former_y_value)
            / (3 + 4 * (u[i] ** 2) + 4 * (former_y_value ** 2))) \
            + (2 / 10) * u[i] + (2 / 10) * former_y_value
        former_y_value = next_y_value
    return u, y


# obtain the original data
u, y = generate_data()

# create a neural network
fnn = FeedForwardNetwork()

# create three layers: input layer: 2 input units; hidden layer: 10 units; output layer: 1 output
inLayer = LinearLayer(2, name='inLayer')
hiddenLayer0 = SigmoidLayer(10, name='hiddenLayer0')
outLayer = LinearLayer(1, name='outLayer')

# add the three layers to the neural network
fnn.addInputModule(inLayer)
fnn.addModule(hiddenLayer0)
fnn.addOutputModule(outLayer)

# link the three layers
in_to_hidden0 = FullConnection(inLayer, hiddenLayer0)
hidden0_to_out = FullConnection(hiddenLayer0, outLayer)

# add the connections to the neural network
fnn.addConnection(in_to_hidden0)
fnn.addConnection(hidden0_to_out)

# make the neural network come into effect
fnn.sortModules()

# define the dataset as two inputs, one output
ds = SupervisedDataSet(2, 1)

# add data elements to the dataset
for i in np.arange(199):
    ds.addSample([u[i], y[i]], [y[i + 1]])

# you can get your input/output this way
X = ds['input']
Y = ds['target']

# split the dataset into a training set and a test set
dataTrain, dataTest = ds.splitWithProportion(0.8)
xTrain, yTrain = dataTrain['input'], dataTrain['target']
xTest, yTest = dataTest['input'], dataTest['target']

# train the NN with the BP algorithm
# verbose=True means print the total error
trainer = BackpropTrainer(fnn, dataTrain, verbose=True, learningrate=0.01)

# set the epoch limit to make the NN fit
trainer.trainUntilConvergence(maxEpochs=1000)

# prediction = fnn.activate(xTest[1])
# print("the predicted number is:", prediction, "the real number is:", yTest[1])

predict_result = []
for i in np.arange(len(xTest)):
    predict_result.append(fnn.activate(xTest[i])[0])
print(predict_result)

plt.figure()
plt.plot(np.arange(0, len(xTest)), predict_result, 'ro--', label='predicted value')
plt.plot(np.arange(0, len(xTest)), yTest, 'ko-', label='true value')
plt.legend()
plt.xlabel("x")
plt.ylabel("y")
plt.show()

for mod in fnn.modules:
    print("Module:", mod.name)
    if mod.paramdim > 0:
        print("--parameters:", mod.params)
    for conn in fnn.connections[mod]:
        print("-connection to", conn.outmod.name)
        if conn.paramdim > 0:
            print("-parameters", conn.params)
    if hasattr(fnn, "recurrentConns"):
        print("Recurrent connections")
        for conn in fnn.recurrentConns:
            print("-", conn.inmod.name, "to", conn.outmod.name)
            if conn.paramdim > 0:
                print("-parameters", conn.params)

Article reference:
[1] Using the PyBrain library for neural network fitting
