1. Introduction
Keras is a Theano-based deep learning framework, designed with reference to Torch. It is written in Python and is a highly modular neural network library that supports both GPU and CPU. See the official Keras documentation for details.
2. Process
First, train a CNN. Then use a Theano function to extract the output of the CNN's fully connected layer, and train an SVM on those features.
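The two-stage idea above can be sketched without Keras or Theano at all. In the sketch below, PCA stands in for the CNN's fully connected layer as the feature extractor (an assumption for illustration only; the post itself uses CNN features), followed by the same kind of RBF SVM, on scikit-learn's small digits dataset:

```python
# Sketch of the two-stage pipeline: feature extractor -> SVM.
# PCA is a stand-in for the CNN FC-layer features used in this post.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=0)

# stage 1: extract features (the post uses CNN FC-layer activations instead)
pca = PCA(n_components=32).fit(X_train)
f_train, f_test = pca.transform(X_train), pca.transform(X_test)

# stage 2: train an RBF SVM on the extracted features
clf = SVC(C=1.0, kernel="rbf").fit(f_train, y_train)
print("svm accuracy: %f" % clf.score(f_test, y_test))
```

The point of the second stage is that the SVM consumes low-dimensional learned features rather than raw pixels, which is exactly what the CNN's FC layer provides in the full code below.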
3. Results Example
Because this is just a Keras + SVM demo, I did not try many parameter settings, so the results are only average.
4. Code
Because the Keras documentation and code keep being updated, much of the code currently online no longer runs. My code is pasted below and can be run directly.
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation, Flatten
from keras.layers.convolutional import Convolution2D, MaxPooling2D
from keras.optimizers import SGD
from keras.datasets import mnist
from keras.layers import BatchNormalization
from keras.utils import np_utils
from sklearn.svm import SVC
import theano


def svc(traindata, trainlabel, testdata, testlabel):
    print("Start training SVM...")
    svcClf = SVC(C=1.0, kernel="rbf", cache_size=3000)
    svcClf.fit(traindata, trainlabel)
    pred_testlabel = svcClf.predict(testdata)
    num = len(pred_testlabel)
    accuracy = len([1 for i in range(num) if testlabel[i] == pred_testlabel[i]]) / float(num)
    print("cnn-svm accuracy:", accuracy)


# each add() appends one layer
model = Sequential()
# 1. convolution, pooling, full connection
# (the second conv's filter count and the dense size were illegible in the
# source; 16 and 128 below are plausible stand-ins)
model.add(Convolution2D(5, 3, 3, border_mode='valid', input_shape=(1, 28, 28), activation='tanh'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Convolution2D(16, 3, 3, activation='tanh'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(128, activation='tanh'))  # full connection
model.add(Dense(10, activation='softmax'))
# 2. alternative: fully connected layers only
# model.add(Dense(100, input_dim=784, init='uniform', activation='tanh'))
# model.add(Dense(100, init='uniform', activation='tanh'))
# model.add(Dense(10, init='uniform', activation='softmax'))
# sgd = SGD(lr=0.2, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(optimizer='sgd', loss='categorical_crossentropy')

(X_train, y_train), (X_test, y_test) = mnist.load_data()
# change data type: Keras' categorical_crossentropy needs one-hot labels
# reshape for variant 2 (fully connected only):
# X_train = X_train.reshape(X_train.shape[0], X_train.shape[1] * X_train.shape[2])
# X_train.shape[0] is 60000; X_train.shape[1] and X_train.shape[2] are 28
# reshape for variant 1 (CNN):
X_train = X_train.reshape(X_train.shape[0], 1, X_train.shape[1], X_train.shape[2])
Y_train = np_utils.to_categorical(y_train, 10)

# integer labels for the SVM
y_train_new = y_train[0:42000]
y_test_new = y_train[42000:]
# new train and test data (the 60000 training images are split 42000/18000)
X_train_new = X_train[0:42000]
X_test = X_train[42000:]
# one-hot labels for the CNN
Y_train_new = Y_train[0:42000]
Y_test = Y_train[42000:]

# nb_epoch and show_accuracy belong to the old (Theano-era) Keras API
model.fit(X_train_new, Y_train_new, batch_size=200, nb_epoch=100, shuffle=True,
          verbose=1, show_accuracy=True, validation_split=0.2)
print("Validation...")
val_loss, val_accuracy = model.evaluate(X_test, Y_test, batch_size=1, show_accuracy=True)
print("val_loss: %f" % val_loss)
print("val_accuracy: %f" % val_accuracy)

# define a Theano function to get the output of the FC layer (layer index 5)
get_feature = theano.function([model.layers[0].input],
                              model.layers[5].get_output(train=False),
                              allow_input_downcast=False)
fc_train_feature = get_feature(X_train_new)
fc_test_feature = get_feature(X_test)
svc(fc_train_feature, y_train_new, fc_test_feature, y_test_new)
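One detail in the code that is easy to miss is the pair of label arrays: the lowercase y_* arrays keep raw integer labels for the SVM, while the uppercase Y_* arrays are one-hot encoded for the CNN's categorical cross-entropy. A minimal numpy sketch of the 42000/18000 split and the one-hot encoding (np_utils.to_categorical produces the equivalent encoding; the array sizes here are toy stand-ins):

```python
import numpy as np

# toy stand-ins: 10 "images" instead of MNIST's 60000, split 7/3
n_total, n_train = 10, 7
X = np.arange(n_total * 4).reshape(n_total, 4).astype("float32")
y = np.arange(n_total) % 3              # integer labels in 0..2

# one-hot encoding, as np_utils.to_categorical would produce
Y = np.eye(3)[y]

# split into new train/test sets, keeping both label variants
X_train_new, X_test = X[:n_train], X[n_train:]
y_train_new, y_test_new = y[:n_train], y[n_train:]   # for the SVM
Y_train_new, Y_test = Y[:n_train], Y[n_train:]       # for the CNN
```

Mixing the two up (feeding one-hot labels to sklearn's SVC, or integer labels to categorical_crossentropy) is a common source of errors when adapting this code.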