Using Keras to create a fitting network to solve a regression problem (machine learning)


This post implements curve fitting with Keras, i.e. a regression problem.

The model has a single input and a single output, with two hidden layers of 100 and 50 neurons respectively.
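As a quick orientation, here is a minimal sketch of that architecture using the Keras 1.x Sequential API (the layer sizes come from the description above; the full script, including data preparation and training, appears later in the post):

from keras.models import Sequential
from keras.layers.core import Dense, Activation

# 1 input -> 100 -> 50 -> 1 output
model = Sequential()
model.add(Dense(100, input_dim=1))   # first hidden layer, 100 neurons
model.add(Activation('relu'))
model.add(Dense(50))                 # second hidden layer, 50 neurons
model.add(Activation('relu'))
model.add(Dense(1))                  # single regression output
model.add(Activation('tanh'))        # tanh: the output takes both negative and positive values
model.compile(loss='mean_squared_error', optimizer='rmsprop')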

The examples in the official Keras documentation are mostly about classification, so a few problems came up when testing regression. In summary, attention should be paid to the following points:

1 The training data should be matrices. Here both the input and the output are 1000x1, i.e. 1000 samples, each sample producing one output.

Note: generating the training data correctly is critical; first of all, check that the dimensions of the input and output data match.

2 Standardize the data. Here the standardization is zero mean and unit variance (see the sketch after this list). Different training models have their own standardization requirements; for details see another note: http://blog.csdn.net/csmqq/article/details/51461696.

3 The activation function of the output layer matters. The fitted output takes both positive and negative values, so tanh is a suitable choice.

4 For regression problems, the loss function passed to compile is usually mean_squared_error.

5 It is worth plotting the input and output of the test data during training; this makes it easier to debug the parameters.

6 When a regression problem is run in Keras, the reported accuracy stays at essentially 0; accuracy is a classification metric and is not meaningful for continuous-valued targets.
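As a minimal sketch of points 1 and 2 (assuming scikit-learn's StandardScaler): reshape the data to matrices and standardize, keeping one scaler per variable so that test inputs can be transformed consistently. Note that the original script below instead fits a single scaler on the already-standardized inputs and reuses it for the targets; the names x_scaler and y_scaler here are illustrative additions, not part of that script. The full script follows.

import numpy as np
from sklearn import preprocessing

# point 1: 1000 samples, one input and one output per sample -> (1000, 1) matrices
x = np.linspace(-2*np.pi, 2*np.pi, 1000).reshape((1000, 1))
y = np.sin(x) + 0.1*np.random.rand(1000, 1)

# point 2: zero mean, unit variance, with a separate scaler for inputs and targets
x_scaler = preprocessing.StandardScaler().fit(x)   # illustrative name, not in the original script
y_scaler = preprocessing.StandardScaler().fit(y)
x = x_scaler.transform(x)
y = y_scaler.transform(y)

# any test or prediction inputs should be transformed with the scaler fitted on the training inputs
x_new = x_scaler.transform(np.linspace(-5, 5, 2000).reshape((2000, 1)))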

# -*- coding: utf-8 -*-
"""
Created on Mon May 13:34:30 2016
@author: Michelle
"""
from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras.optimizers import SGD
from keras.layers.advanced_activations import LeakyReLU
from sklearn import preprocessing
from keras.utils.visualize_plots import figures
import matplotlib.pyplot as plt
import numpy as np

# part 1: training data
# generate numbers from -2*pi to 2*pi
x_train = np.linspace(-2*np.pi, 2*np.pi, 1000)            # array of shape (1000,)
x_train = np.array(x_train).reshape((len(x_train), 1))    # reshape to a (1000, 1) matrix
n = 0.1*np.random.rand(len(x_train), 1)                   # noise matrix of shape (len(x_train), 1); np.random.rand values lie in (0, 1)
y_train = np.sin(x_train) + n

# training set: zero mean, unit variance
x_train = preprocessing.scale(x_train)
scaler = preprocessing.StandardScaler().fit(x_train)      # note: fit on the already-standardized inputs, then reused for y_train, x_test and x_prd
y_train = scaler.transform(y_train)

# part 2: test data
x_test = np.linspace(-5, 5, 2000)
x_test = np.array(x_test).reshape((len(x_test), 1))
y_test = np.sin(x_test)
# zero mean, unit variance
x_test = scaler.transform(x_test)
#y_test = scaler.transform(y_test)

## plot the testing data
#fig, ax = plt.subplots()
#ax.plot(x_test, y_test, 'g')

# prediction data
x_prd = np.linspace(-3, 3, 101)
x_prd = np.array(x_prd).reshape((len(x_prd), 1))
x_prd = scaler.transform(x_prd)
y_prd = np.sin(x_prd)
# plot the prediction input
fig, ax = plt.subplots()
ax.plot(x_prd, y_prd, 'r')

# part 3: create the model, with 2 hidden layers
model = Sequential()
model.add(Dense(100, init='uniform', input_dim=1))   # first hidden layer: 100 neurons
#model.add(Activation(LeakyReLU(alpha=0.1)))
model.add(Activation('relu'))
model.add(Dense(50))                                 # second hidden layer: 50 neurons
#model.add(Activation(LeakyReLU(alpha=0.1)))
model.add(Activation('relu'))
model.add(Dense(1))                                  # single regression output
#model.add(Activation(LeakyReLU(alpha=0.1)))
model.add(Activation('tanh'))

#sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='mean_squared_error', optimizer='rmsprop', metrics=['accuracy'])
#model.compile(loss='mean_squared_error', optimizer=sgd, metrics=['accuracy'])

model.fit(x_train, y_train, nb_epoch=64, batch_size=20, verbose=0)
hist = model.fit(x_test, y_test, batch_size=10, nb_epoch=100, shuffle=True, verbose=0, validation_split=0.2)
#print(hist.history)
score = model.evaluate(x_test, y_test, batch_size=10)
out = model.predict(x_prd, batch_size=1)

# plot the predictions
ax.plot(x_prd, out, 'k--', lw=4)
ax.set_xlabel('measured')
ax.set_ylabel('predicted')
plt.show()
figures(hist)


In the resulting plot, the black dashed line is the predicted value and the red curve is the input (true) value.
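Since y_train was passed through the scaler, the network's raw predictions live in that transformed space. If predictions are needed on the original scale of y, the transform can be undone; this is not part of the original script, just a small sketch using scikit-learn's inverse_transform on the variables defined above:

# map predictions back to the original target scale (illustrative addition, not in the original script)
out_original = scaler.inverse_transform(out)   # undoes the StandardScaler that was applied to y_train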


The function that plots the error values against the number of epochs lives in visualize_plots.py:

1) Put it under C:\Anaconda2\Lib\site-packages\keras\utils.

2) To use it, add the line from keras.utils.visualize_plots import figures, and then call figures(hist) directly in the program.

The implementation of this function is:

# -*- coding: utf-8 -*-
"""
Created on Sat May 22:26:24 2016
@author: Shemmy
"""

def figures(history, figure_name="plots"):
    """Method to visualize accuracies and loss vs epoch for training as well as testing data.
    Arguments:
        history     = an instance returned by the model.fit method
        figure_name = a string used as the file name of the saved plots; by default it is set to "plots"
    Usage:
        hist = model.fit(x, y)
        figures(hist)
    """
    from keras.callbacks import History
    if isinstance(history, History):
        import matplotlib.pyplot as plt
        hist = history.history
        epoch = history.epoch
        acc = hist['acc']
        loss = hist['loss']
        val_loss = hist['val_loss']
        val_acc = hist['val_acc']
        plt.figure(1)
        plt.subplot(221)
        plt.plot(epoch, acc)
        plt.title("Training Accuracy vs Epoch")
        plt.xlabel("Epoch")
        plt.ylabel("Accuracy")
        plt.subplot(222)
        plt.plot(epoch, loss)
        plt.title("Training Loss vs Epoch")
        plt.xlabel("Epoch")
        plt.ylabel("Loss")
        plt.subplot(223)
        plt.plot(epoch, val_acc)
        plt.title("Validation Acc vs Epoch")
        plt.xlabel("Epoch")
        plt.ylabel("Validation Accuracy")
        plt.subplot(224)
        plt.plot(epoch, val_loss)
        plt.title("Validation Loss vs Epoch")
        plt.xlabel("Epoch")
        plt.ylabel("Validation Loss")
        plt.tight_layout()
        plt.savefig(figure_name)
    else:
        print("Input argument is not an instance of class History")
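One practical note: figures() reads the acc, loss, val_acc and val_loss entries of the history, so the History object must come from a fit call that used validation data and from a model compiled with metrics=['accuracy'], as in the main script above. A minimal usage sketch with the Keras 1.x fit signature used in this post:

from keras.utils.visualize_plots import figures

# validation_split is required so that hist.history contains 'val_acc' and 'val_loss'
hist = model.fit(x_train, y_train, nb_epoch=100, batch_size=10,
                 validation_split=0.2, verbose=0)
figures(hist)   # draws the four subplots and saves them to the file named by figure_name ("plots" by default)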


Regression problems in Keras are discussed in this issue: https://github.com/fchollet/keras/issues/108
