Keras MNIST handwritten digit recognition

Source: Internet
Author: User
Tags: shuffle, keras

I have recently been paying attention to Keras; it seemed quite convenient, and after trying it today I found that it really is. It not only provides commonly used building blocks such as layers, normalization, regularization, and activations, but also includes several common datasets such as CIFAR-10 and MNIST.

The following code is the Keras "Hello World": MNIST handwritten digit recognition implemented with an MLP.

from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD
from keras.datasets import mnist
import numpy

model = Sequential()
model.add(Dense(784, 500, init='glorot_uniform'))  # input layer, 28*28=784; first hidden layer has 500 nodes
model.add(Activation('tanh'))                      # activation function is tanh
model.add(Dropout(0.5))                            # use 50% dropout
model.add(Dense(500, 500, init='glorot_uniform'))  # hidden layer with 500 nodes
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(500, 10, init='glorot_uniform'))   # the output has 10 categories, so the dimension is 10
model.add(Activation('softmax'))                   # last layer uses softmax

sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)  # set the learning rate (lr) and other parameters
model.compile(loss='categorical_crossentropy', optimizer=sgd, class_mode='categorical')  # use cross entropy as the loss function

(X_train, y_train), (X_test, y_test) = mnist.load_data()  # read the data with Keras's mnist tool (needs network access the first time)

# Since the mnist input dimensions are (num, 28, 28), flatten each image into a 784-d vector
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1] * X_train.shape[2])
X_test = X_test.reshape(X_test.shape[0], X_test.shape[1] * X_test.shape[2])

# As in the previous article, convert the integer labels into one-hot matrices
Y_train = (numpy.arange(10) == y_train[:, None]).astype(int)
Y_test = (numpy.arange(10) == y_test[:, None]).astype(int)

# Start training. There are quite a few arguments here:
# batch_size is the batch size; nb_epoch is the number of iterations; shuffle controls whether the
# data are randomly shuffled before each epoch.
# verbose is the logging mode; the docs say: verbose: 0 for no logging to stdout, 1 for progress bar
# logging, 2 for one log line per epoch -- that is, 0 prints nothing, 1 shows a progress bar, and 2
# shows one line of data per epoch.
# show_accuracy displays the accuracy after each epoch; validation_split is the fraction of the data
# held out for validation.
model.fit(X_train, Y_train, batch_size=200, nb_epoch=100, shuffle=True, verbose=1,
          show_accuracy=True, validation_split=0.3)

print('Test set')
model.evaluate(X_test, Y_test, batch_size=200, show_accuracy=True, verbose=1)
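The one-hot conversion in the script uses a NumPy broadcasting trick rather than any Keras utility, so it can be tried on its own. A minimal sketch (the labels here are chosen arbitrarily for illustration; MNIST labels are integers 0 to 9):

```python
import numpy

# Hypothetical class labels, standing in for y_train.
y = numpy.array([2, 0, 9])

# Broadcasting: comparing a (10,) range against a (3, 1) column of labels
# yields a (3, 10) boolean matrix that is True exactly at each label's index.
one_hot = (numpy.arange(10) == y[:, None]).astype(int)

print(one_hot.shape)  # (3, 10)
print(one_hot[0])     # [0 0 1 0 0 0 0 0 0 0]
```

Each row sums to 1, which is what `categorical_crossentropy` expects as a target.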

The screen then prints out a large pile of output like this:

ssh://shibotian@***.***.***.***:22/usr/bin/python -u /usr/local/shared_dir/local/ipython_shibotian/shibotian/code/kreas_test1/run.py
Using gpu device 0: Tesla K40m
Train on 42000 samples, validate on 18000 samples
Epoch 0
42000/42000 [==============================] - 1s - loss: 0.9894 - acc.: 0.7386 - val. loss: 0.4795 - val. acc.: 0.8807
Epoch 1
42000/42000 [==============================] - 1s - loss: 0.5635 - acc.: 0.8360 - val. loss: 0.4084 - val. acc.: 0.8889

omitted .....

Epoch
42000/42000 [==============================] - 1s - loss: 0.2838 - acc.: 0.9116 - val. loss: 0.1872 - val. acc.: 0.9418
Epoch
42000/42000 [==============================] - 1s - loss: 0.2740 - acc.: 0.9163 - val. loss: 0.1842 - val. acc.: 0.9434
Test set
10000/10000 [==============================] - 0s - loss: 0.1712 - acc.: 0.9480
Process finished with exit code 0
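The 42000/18000 split in the log follows directly from `validation_split=0.3` applied to the 60,000 MNIST training images. A quick check of the arithmetic:

```python
n_train_total = 60000     # size of the MNIST training set
validation_split = 0.3    # fraction held out, as passed to model.fit above

n_val = round(n_train_total * validation_split)  # images held out for validation
n_fit = n_train_total - n_val                    # images actually used for fitting
print(n_fit, n_val)  # 42000 18000
```

This matches the "Train on 42000 samples, validate on 18000 samples" line printed by Keras.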

P.S. The progress bar you get with verbose=1 is really cute; thumbs up!
