Embedding layer
keras.layers.Embedding(input_dim, output_dim, embeddings_initializer='uniform', embeddings_regularizer=None, activity_regularizer=None, embeddings_constraint=None, mask_zero=False, input_length=None)
input_dim: integer >= 0. Size of the vocabulary, i.e. the maximum integer index in the input data + 1.
output_dim: integer > 0. Dimension of the dense embedding.
Input shape: 2D tensor with shape (samples, sequence_length).
Output shape: 3D tensor with shape (samples, sequence_length, output_dim).
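Conceptually, an Embedding layer is just a lookup table of shape (input_dim, output_dim): each integer index in the 2D input is replaced by its embedding vector, producing the 3D output. A minimal NumPy sketch of that lookup (not Keras code; the sizes are illustrative):

```python
import numpy as np

# Illustrative sizes: vocabulary, embedding dimension, batch, sequence length.
input_dim, output_dim = 1000, 64
samples, sequence_length = 32, 10

# The layer's weights form an (input_dim, output_dim) lookup table
# (here initialized uniformly, mimicking embeddings_initializer='uniform').
table = np.random.uniform(-0.05, 0.05, size=(input_dim, output_dim))

# 2D integer input (samples, sequence_length), indices in [0, input_dim).
x = np.random.randint(input_dim, size=(samples, sequence_length))

# Fancy indexing replaces each index with its row of the table:
y = table[x]
assert y.shape == (samples, sequence_length, output_dim)
```

This is why the output shape is (samples, sequence_length, output_dim): one output_dim-sized vector per input integer.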
from keras.models import Sequential
from keras.layers import Embedding
import numpy as np

model = Sequential()
model.add(Embedding(1000, output_dim=64, input_length=10))  # input_dim=1000 is the vocabulary size
# The model will take as input an integer matrix of size (batch, input_length).
# The largest integer (i.e. word index) in the input should be no larger than 999 (input_dim - 1).
# model.input_shape == (None, 10), where None is the batch dimension and 10 is input_length (time steps).
# model.output_shape == (None, 10, 64): batch dimension, input_length, output_dim.
# The input is a 2D array of integers between 0 and 999, shaped (samples, time steps).
input_array = np.random.randint(1000, size=(32, 10))
model.compile('rmsprop', 'mse')
output_array = model.predict(input_array)
assert output_array.shape == (32, 10, 64)
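One parameter from the signature worth a note: mask_zero=True treats index 0 as a padding value, so real word indices must start at 1 (and input_dim must cover vocabulary size + 1). The mask Keras derives for downstream layers is simply "index is not zero"; a NumPy sketch of that rule (not Keras internals, example sequences are made up):

```python
import numpy as np

# Two sequences padded with 0 to a common length of 5.
x = np.array([[4, 10, 5, 0, 0],
              [7,  2, 0, 0, 0]])

# With mask_zero=True, the mask passed to downstream layers
# (e.g. an LSTM) marks padding steps False so they are skipped.
mask = (x != 0)
```

Without mask_zero, index 0 is an ordinary vocabulary entry and every time step is processed.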
Reference
http://keras-cn.readthedocs.io/en/latest/layers/embedding_layer/