Installation. The easiest way is via pip:
sudo pip install keras
Or install manually:
Download: git clone git://github.com/fchollet/keras.git
Copy it to the target machine if needed.
Install: cd into the Keras folder and run the install command:
sudo python setup.py install
Keras is built on Theano, so before learning Keras it helps to understand the following material first:
http://blog.csdn.net/mmc2015/article/details/42222075 (LR)
http://www.deeplearning.net/tutorial/gettingstarted.html and http://www.deeplearning.net/tutorial/logreg.html (classifying MNIST digits using logistic regression)
The full tutorial contents: http://www.deeplearning.net/tutorial/contents.html
Take the code from the first link as an example (it is relatively simple):
import numpy
import theano
import theano.tensor as T

rng = numpy.random

N = 400                                   # training sample size
feats = 784                               # number of input variables

# generate a dataset: D = (input_values, target_class)
D = (rng.randn(N, feats), rng.randint(size=N, low=0, high=2))
training_steps = 10000

# Declare Theano symbolic variables
x = T.matrix("x")
y = T.vector("y")

# Initialize the weight vector w randomly; this and the following bias
# variable b are shared so they keep their values between training
# iterations (updates)
w = theano.shared(rng.randn(feats), name="w")

# Initialize the bias term
b = theano.shared(0., name="b")

print("Initial model:")
print(w.get_value())
print(b.get_value())

# Construct Theano expression graph
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b))            # probability that target = 1
prediction = p_1 > 0.5                             # the prediction thresholded
xent = -y * T.log(p_1) - (1 - y) * T.log(1 - p_1)  # cross-entropy loss function
cost = xent.mean() + 0.01 * (w ** 2).sum()         # the cost to minimize
gw, gb = T.grad(cost, [w, b])                      # compute the gradient of the cost
                                                   # w.r.t weight vector w and bias term b
                                                   # (we shall return to this in a
                                                   # following section of this tutorial)

# Compile
train = theano.function(
          inputs=[x, y],
          outputs=[prediction, xent],
          updates=((w, w - 0.1 * gw), (b, b - 0.1 * gb)))
predict = theano.function(inputs=[x], outputs=prediction)

# Train
for i in range(training_steps):
    pred, err = train(D[0], D[1])

print("Final model:")
print(w.get_value())
print(b.get_value())
print("target values for D:")
print(D[1])
print("prediction on D:")
print(predict(D[0]))
We found that building a model using Theano typically requires the following steps:
0) Preprocess the data
# generate a dataset: D = (input_values, target_class)
1) Define variables
# Declare Theano symbolic variables
2) Build the (graph) model
# Construct Theano expression graph
3) Compile the model: theano.function()
# Compile
4) Train the model
# Train
5) Predict on new data
print(predict(D[0]))
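The same five steps can be sketched in plain NumPy, so the workflow is visible without Theano's symbolic machinery. This is an illustrative sketch of the same regularized logistic regression, not code from the original post; the sizes, learning rate, and step count are assumed values chosen for the example.

```python
import numpy as np

rng = np.random.RandomState(0)

# 0) preprocess data: D = (input_values, target_class)
N, feats = 400, 784
D = (rng.randn(N, feats), rng.randint(size=N, low=0, high=2))

# 1) define variables (small init keeps the sigmoid out of saturation)
w = 0.01 * rng.randn(feats)
b = 0.0

# 2) build the model: p(target = 1 | x) and the regularized cross-entropy cost
def p_1(X, w, b):
    return 1.0 / (1.0 + np.exp(-(X.dot(w) + b)))

def cost(X, y, w, b):
    p = np.clip(p_1(X, w, b), 1e-12, 1 - 1e-12)    # guard the logs
    xent = -y * np.log(p) - (1 - y) * np.log(1 - p)
    return xent.mean() + 0.01 * (w ** 2).sum()

# 3) "compile": here, just an ordinary Python function for one gradient step
def train_step(X, y, w, b, lr=0.1):
    grad_logit = p_1(X, w, b) - y                  # d(xent)/d(logit)
    gw = X.T.dot(grad_logit) / len(y) + 0.02 * w   # mean xent grad + reg grad
    gb = grad_logit.mean()
    return w - lr * gw, b - lr * gb

# 4) train
cost_before = cost(D[0], D[1], w, b)
for _ in range(200):
    w, b = train_step(D[0], D[1], w, b)
cost_after = cost(D[0], D[1], w, b)

# 5) predict (here, on the training data itself)
pred = p_1(D[0], w, b) > 0.5
```

Gradient descent on this convex cost steadily reduces it, mirroring what the compiled Theano train function does at each call.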
So, what's the difference between Theano and Keras?
http://keras.io/
They sit at different levels of abstraction: Keras wraps things up more neatly, which makes programming more convenient (though debugging is more of a hassle), while Theano programming is more flexible and fully customizable, which suits researchers well.
In addition, Keras and TensorFlow are fully compatible ...
Keras has two kinds of models, Sequential and Graph, which will not be explained here.
Let's look at how quickly Keras can build a model, using Sequential as an example:
from keras.models import Sequential
model = Sequential()  # 1) define the model

from keras.layers.core import Dense, Activation
model.add(Dense(output_dim=64, input_dim=100, init="glorot_uniform"))  # 2) build the graph model
model.add(Activation("relu"))
model.add(Dense(output_dim=10, init="glorot_uniform"))
model.add(Activation("softmax"))

from keras.optimizers import SGD
model.compile(loss='categorical_crossentropy', optimizer=SGD(lr=0.01, momentum=0.9, nesterov=True))  # 3) compile the model

model.fit(X_train, Y_train, nb_epoch=5, batch_size=32)  # 4) train the model
objective_score = model.evaluate(X_test, Y_test, batch_size=32)

classes = model.predict_classes(X_test, batch_size=32)  # 5) predict
proba = model.predict_proba(X_test, batch_size=32)
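To see what this Sequential model actually computes at prediction time, here is the forward pass sketched in plain NumPy: Dense(100 → 64) with relu, then Dense(64 → 10) with softmax. The random weights below merely stand in for Keras's glorot_uniform-initialized ones, and the batch size of 32 is an assumed example value.

```python
import numpy as np

rng = np.random.RandomState(0)
W1, b1 = 0.1 * rng.randn(100, 64), np.zeros(64)   # stand-ins for Dense(100 -> 64)
W2, b2 = 0.1 * rng.randn(64, 10), np.zeros(10)    # stand-ins for Dense(64 -> 10)

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

def forward(X):
    h = relu(X.dot(W1) + b1)        # Dense + Activation("relu")
    return softmax(h.dot(W2) + b2)  # Dense + Activation("softmax")

X = rng.randn(32, 100)              # one batch of 32 samples with 100 features
proba = forward(X)                  # analogous to model.predict_proba
classes = proba.argmax(axis=1)      # analogous to model.predict_classes
```

Each row of proba sums to 1, and predict_classes simply takes the argmax over those class probabilities.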
Finally, here is the Keras framework itself, for you to study on your own: http://keras.io/