Examples of Keras (getting started)


The following examples use the Keras 1.x Sequential API (note the 1.x-era arguments such as nb_epoch, init=, and border_mode=).

1 Multi-class softmax classification with a multilayer perceptron:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.optimizers import SGD

model = Sequential()
# Dense(64) is a fully-connected layer with 64 hidden units.
# In the first layer, you must specify the expected input data shape:
# here, 20-dimensional vectors.
model.add(Dense(64, input_dim=20, init='uniform'))
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(64, init='uniform'))
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(10, init='uniform'))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy',
              optimizer=sgd,
              metrics=['accuracy'])
model.fit(X_train, Y_train,
          nb_epoch=20,
          batch_size=16)
score = model.evaluate(X_test, Y_test, batch_size=16)
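
X_train, Y_train, X_test, and Y_test are not defined in the snippet above. A minimal sketch, assuming synthetic data with hypothetical shapes and sample counts, that makes the example runnable end to end:

import numpy as np
from keras.utils import np_utils

# Hypothetical synthetic data: 20-dimensional inputs, 10 classes.
X_train = np.random.random((1000, 20))
Y_train = np_utils.to_categorical(np.random.randint(10, size=(1000,)), 10)
X_test = np.random.random((100, 20))
Y_test = np_utils.to_categorical(np.random.randint(10, size=(100,)), 10)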

2 Another implementation of a similar MLP:

model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adadelta',
              metrics=['accuracy'])

3 Multilayer perceptron for binary classification:

model = Sequential()
model.add(Dense(64, input_dim=20, init='uniform', activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])
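
Because this model ends in a single sigmoid unit with binary_crossentropy, the labels are plain 0/1 values rather than one-hot vectors. A minimal sketch with hypothetical synthetic data:

import numpy as np

X_train = np.random.random((1000, 20))
Y_train = np.random.randint(2, size=(1000, 1))  # 0/1 labels for binary_crossentropy
model.fit(X_train, Y_train, nb_epoch=20, batch_size=16)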

4 VGG-like convolutional neural network:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, Flatten
from keras.layers import Convolution2D, MaxPooling2D
from keras.optimizers import SGD

model = Sequential()
# Input: 100x100 images with 3 channels -> (3, 100, 100) tensors.
# This applies 32 convolution filters of size 3x3 each.
model.add(Convolution2D(32, 3, 3, border_mode='valid', input_shape=(3, 100, 100)))
model.add(Activation('relu'))
model.add(Convolution2D(32, 3, 3))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Convolution2D(64, 3, 3, border_mode='valid'))
model.add(Activation('relu'))
model.add(Convolution2D(64, 3, 3))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Flatten())
# Note: Keras does automatic shape inference.
model.add(Dense(256))
model.add(Activation('relu'))
model.add(Dropout(0.5))

model.add(Dense(10))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd)

model.fit(X_train, Y_train, batch_size=32, nb_epoch=1)
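
The input_shape=(3, 100, 100) above implies channels-first (Theano-style) dimension ordering. A minimal sketch of matching synthetic data, with hypothetical sample counts:

import numpy as np
from keras.utils import np_utils

# 100 synthetic RGB images in channels-first order: (samples, channels, height, width).
X_train = np.random.random((100, 3, 100, 100))
Y_train = np_utils.to_categorical(np.random.randint(10, size=(100,)), 10)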

5 Sequence classification using LSTM:

from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.layers import Embedding
from keras.layers import LSTM

model = Sequential()
model.add(Embedding(max_features, 256, input_length=maxlen))
model.add(LSTM(output_dim=128, activation='sigmoid', inner_activation='hard_sigmoid'))
model.add(Dropout(0.5))
model.add(Dense(1))
model.add(Activation('sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

model.fit(X_train, Y_train, batch_size=16, nb_epoch=10)
score = model.evaluate(X_test, Y_test, batch_size=16)
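
Here max_features and maxlen are assumed to be defined, and X_train must be padded to a uniform length before it can feed the Embedding layer. A minimal sketch, with hypothetical values:

from keras.preprocessing import sequence

max_features = 20000  # hypothetical vocabulary size
maxlen = 100          # hypothetical sequence length to cut/pad to

# X_train/X_test start as lists of lists of word indices; pad them to maxlen.
X_train = sequence.pad_sequences(X_train, maxlen=maxlen)
X_test = sequence.pad_sequences(X_test, maxlen=maxlen)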

Image captioning with gated recurrent units (GRU):

Note that for this network to work well, a much larger convolutional network initialized with pre-trained weights is required; this is only an example of the structure.

from keras.models import Sequential
from keras.layers import Dense, Activation, Flatten, RepeatVector, Merge
from keras.layers import Convolution2D, MaxPooling2D
from keras.layers import Embedding, GRU, TimeDistributed

max_caption_len = 16
vocab_size = 10000

# First, let's define an image model that
# will encode pictures into 128-dimensional vectors.
# It should be initialized with pre-trained weights.
image_model = Sequential()
image_model.add(Convolution2D(32, 3, 3, border_mode='valid', input_shape=(3, 100, 100)))
image_model.add(Activation('relu'))
image_model.add(Convolution2D(32, 3, 3))
image_model.add(Activation('relu'))
image_model.add(MaxPooling2D(pool_size=(2, 2)))

image_model.add(Convolution2D(64, 3, 3, border_mode='valid'))
image_model.add(Activation('relu'))
image_model.add(Convolution2D(64, 3, 3))
image_model.add(Activation('relu'))
image_model.add(MaxPooling2D(pool_size=(2, 2)))

image_model.add(Flatten())
image_model.add(Dense(128))

# Let's load the weights from a save file.
image_model.load_weights('weight_file.h5')

# Next, let's define a RNN model that encodes sequences of words
# into sequences of 128-dimensional word vectors.
language_model = Sequential()
language_model.add(Embedding(vocab_size, 256, input_length=max_caption_len))
language_model.add(GRU(output_dim=128, return_sequences=True))
language_model.add(TimeDistributed(Dense(128)))

# Let's repeat the image vector to turn it into a sequence.
image_model.add(RepeatVector(max_caption_len))

# The output of both models will be tensors of shape (samples, max_caption_len, 128).
# Let's concatenate these 2 vector sequences.
model = Sequential()
model.add(Merge([image_model, language_model], mode='concat', concat_axis=-1))
# Let's encode this vector sequence into a single vector,
# which will be used to compute a probability
# distribution over what the next word in the caption should be!
model.add(GRU(256, return_sequences=False))
model.add(Dense(vocab_size))
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy', optimizer='rmsprop')

# "images" is a numpy float array of shape (nb_samples, nb_channels=3, width, height).
# "captions" is a numpy integer array of shape (nb_samples, max_caption_len)
# containing word index sequences representing partial captions.
# "next_words" is a numpy float array of shape (nb_samples, vocab_size)
# containing a categorical encoding (0s and 1s) of the next word in the
# corresponding partial caption.
model.fit([images, partial_captions], next_words, batch_size=16, nb_epoch=100)
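
The comments describe the training data but not how to build it. A sketch, my assumption rather than part of the original, of how a single encoded caption could be expanded into (partial caption, next word) training pairs:

import numpy as np
from keras.preprocessing import sequence
from keras.utils import np_utils

# Hypothetical encoded caption: word indices for "<start> a dog runs <end>".
caption = [1, 42, 7, 99, 2]

partial_captions, next_words = [], []
for i in range(1, len(caption)):
    partial_captions.append(caption[:i])  # prefix of the caption seen so far
    next_words.append(caption[i])         # word the model should predict next

partial_captions = sequence.pad_sequences(partial_captions, maxlen=max_caption_len)
next_words = np_utils.to_categorical(next_words, vocab_size)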

