TimeDistributed wrapper
keras.layers.wrappers.TimeDistributed(layer)
This wrapper applies a layer to every time step of the input.
Parameters

- layer: a Keras layer object

The input should be at least 3D, and the dimension at index 1 is treated as the time dimension.

For example, consider a batch of 32 samples, where each sample is a sequence of 10 vectors of length 16. The batch input shape is then (32, 10, 16), and the input_shape (which does not include the batch size) is (10, 16).
We can use TimeDistributed to wrap a Dense layer, so that an independent fully connected layer is applied to the signal at each time step:
# as the first layer in a model
model = Sequential()
model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))
# now model.output_shape == (None, 10, 8)

# subsequent layers: no need for input_shape
model.add(TimeDistributed(Dense(32)))
# now model.output_shape == (None, 10, 32)
The output of the first layer has shape (32, 10, 8); after the second Dense layer, the output shape is (32, 10, 32).
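As a quick sanity check, these shapes can be verified by pushing random data through the model. This is an illustrative sketch, not part of the official example; the numpy input is an assumption:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.layers.wrappers import TimeDistributed

model = Sequential()
model.add(TimeDistributed(Dense(8), input_shape=(10, 16)))
model.add(TimeDistributed(Dense(32)))

# a batch of 32 samples, each a sequence of 10 vectors of length 16
x = np.random.random((32, 10, 16))
print(model.predict(x).shape)  # (32, 10, 32)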
Wrapping Dense with TimeDistributed is strictly equivalent to layers.TimeDistributedDense. The difference is that TimeDistributed can also wrap other layers, such as Convolution2D:
model = Sequential()
model.add(TimeDistributed(Convolution2D(64, 3, 3), input_shape=(10, 3, 299, 299)))
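A typical use of this pattern is extracting per-frame convolutional features from a sequence of images and then summarizing them with a recurrent layer. The following is a minimal sketch under the Keras 1.x API used in this document; the Flatten and LSTM layers and the smaller frame size are illustrative assumptions, and channels-first ('th') image ordering is assumed, as in the example above:

from keras.models import Sequential
from keras.layers import Convolution2D, Flatten, LSTM
from keras.layers.wrappers import TimeDistributed

model = Sequential()
# apply the same convolution to each of the 10 frames:
# (10, 3, 32, 32) -> (10, 64, 30, 30), assuming 'th' dim ordering
model.add(TimeDistributed(Convolution2D(64, 3, 3), input_shape=(10, 3, 32, 32)))
# flatten each frame's feature map: -> (10, 64 * 30 * 30)
model.add(TimeDistributed(Flatten()))
# summarize the sequence of per-frame features
model.add(LSTM(32))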
Bidirectional wrapper
keras.layers.wrappers.Bidirectional(layer, merge_mode='concat', weights=None)
Bidirectional RNN Wrapper
Parameters
- layer: a Recurrent layer object
- merge_mode: the mode by which the outputs of the forward and backward RNNs are combined; one of 'sum', 'mul', 'concat', 'ave', or None. If set to None, the outputs are not combined and are returned as a list (see the sketch below).
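The effect of merge_mode on the output shape can be seen in a small sketch (the sizes are illustrative): with 10 units, 'concat' stacks the forward and backward outputs to give a last dimension of 20, while 'sum', 'mul', and 'ave' keep it at 10.

from keras.models import Sequential
from keras.layers import LSTM
from keras.layers.wrappers import Bidirectional

for mode in ['concat', 'sum']:
    model = Sequential()
    model.add(Bidirectional(LSTM(10), merge_mode=mode, input_shape=(5, 10)))
    print(mode, model.output_shape)
# concat -> (None, 20)
# sum    -> (None, 10)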
Example
model = Sequential()
model.add(Bidirectional(LSTM(10, return_sequences=True), input_shape=(5, 10)))
model.add(Bidirectional(LSTM(10)))
model.add(Dense(5))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
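The model above can be exercised end to end on random data. A hedged sketch: the numpy arrays are placeholders, and nb_epoch follows the Keras 1.x fit signature assumed throughout this document:

import numpy as np

# 100 samples of 5 timesteps with 10 features each, and one-hot labels over 5 classes
x = np.random.random((100, 5, 10))
y = np.eye(5)[np.random.randint(0, 5, size=100)]
model.fit(x, y, nb_epoch=1, batch_size=32)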