Reprinted from: http://blog.csdn.net/mmc2015/article/details/50976776
First, install Keras with pip:
sudo pip install keras
or install it manually:
Download: git clone git://github.com/fchollet/keras.git
Upload it to the target machine.
Install: cd into the keras folder and run the install command:
sudo python setup.py install
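To check that the installation worked, a minimal sanity check is to import the package and build a one-layer logistic-regression model (a sketch only; the exact layer arguments assume the standard Keras Sequential API of the time):

# verify the install: import Keras and build a tiny logistic-regression model
import keras
from keras.models import Sequential
from keras.layers import Dense

print(keras.__version__)

model = Sequential()
model.add(Dense(1, input_dim=784, activation='sigmoid'))  # one sigmoid unit over 784 inputs
model.compile(loss='binary_crossentropy', optimizer='sgd')
model.summary()

If the import succeeds and the summary prints without errors, Keras and its Theano backend are set up correctly.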
Keras is built on top of Theano. Before learning Keras, it helps to first understand the following material:
http://blog.csdn.net/mmc2015/article/details/42222075 (logistic regression, LR)
http://www.deeplearning.net/tutorial/gettingstarted.html and http://www.deeplearning.net/tutorial/logreg.html (classifying MNIST digits using logistic regression)
The full tutorial contents: http://www.deeplearning.net/tutorial/contents.html
Take the code from the first link as a (relatively simple) example:
import numpy
import theano
import theano.tensor as T

rng = numpy.random

N = 400                                   # training sample size
feats = 784                               # number of input variables

# generate a dataset: D = (input_values, target_class)
D = (rng.randn(N, feats), rng.randint(size=N, low=0, high=2))
training_steps = 10000

# declare Theano symbolic variables
x = T.matrix("x")
y = T.vector("y")

# initialize the weight vector w randomly
#
# this and the following bias variable b
# are shared so they keep their values
# between training iterations (updates)
w = theano.shared(rng.randn(feats), name="w")

# initialize the bias term
b = theano.shared(0., name="b")

print("Initial model:")
print(w.get_value())
print(b.get_value())

# construct Theano expression graph
p_1 = 1 / (1 + T.exp(-T.dot(x, w) - b))            # probability that target = 1
prediction = p_1 > 0.5                             # the prediction thresholded
xent = -y * T.log(p_1) - (1 - y) * T.log(1 - p_1)  # cross-entropy loss function
cost = xent.mean() + 0.01 * (w ** 2).sum()         # the cost to minimize
gw, gb = T.grad(cost, [w, b])                      # compute the gradient of the cost
                                                   # w.r.t. weight vector w and
                                                   # bias term b
                                                   # (we shall return to this in a
                                                   # following section of this tutorial)
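The graph above only defines the model and its gradients; the Theano tutorial that this example comes from continues by compiling train and predict functions and running plain gradient descent. For completeness, here is that remaining part (a sketch following the Theano documentation; the 0.1 learning rate is the tutorial's choice):

# compile the training and prediction functions
train = theano.function(
    inputs=[x, y],
    outputs=[prediction, xent],
    updates=((w, w - 0.1 * gw), (b, b - 0.1 * gb)))   # one gradient-descent step per call
predict = theano.function(inputs=[x], outputs=prediction)

# train
for i in range(training_steps):
    pred, err = train(D[0], D[1])

print("Final model:")
print(w.get_value())
print(b.get_value())
print("target values for D:")
print(D[1])
print("prediction on D:")
print(predict(D[0]))

Note how every update to the shared variables w and b has to be written out explicitly in the updates argument; Keras hides exactly this kind of bookkeeping.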