[Keras] Writing a custom network layer (Layer) with Keras


Keras provides many common, ready-made layer objects, such as convolution layers and pooling layers, which we can call directly with code like the following:

# Call a Conv2D layer
from keras import layers

conv2d = layers.Conv2D(filters,
                       kernel_size,
                       strides=(1, 1),
                       padding='valid',
                       ...)

However, in practical applications, we often need to build some layer objects ourselves to meet the special needs of some custom networks.
Fortunately, Keras provides good support for the custom layer.

The following is a summary of the common methods.

Method 1: keras.layers.core.Lambda()

If our custom layer contains no trainable weights and only applies a functional transformation to the output of the previous layer, we can use the Lambda layer from the keras.layers.core module directly (the module that contains the common base layers, such as Dense, Activation, and so on):

keras.layers.core.Lambda(function, output_shape=None, mask=None, arguments=None)

Parameter description:
function: the function to implement; it accepts only one variable, the output of the previous layer
output_shape: the shape of the value the function returns; can be a tuple, or a function that computes the output shape from the input shape
mask: masking
arguments: optional; a dictionary of extra keyword arguments to pass to the function

A minimal usage example follows.
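For illustration, here is a minimal sketch of a Lambda layer; the model and the squaring function are made up for this example. The layer adds no trainable weights, it only applies an element-wise transformation to the previous layer's output:

from keras.models import Sequential
from keras.layers import Dense, Lambda

model = Sequential()
model.add(Dense(32, input_shape=(16,)))
# No trainable weights: a pure element-wise transformation
model.add(Lambda(lambda x: x ** 2))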

But most of the time, what we need to define is a new layer with trainable weights, and for that we need the following method.

Method 2: Subclass Layer

keras.engine.topology contains the Layer parent class, and we can implement our own layer by inheriting from it.
To customize your own layer, you need to implement the following three methods (a minimal skeleton follows the list):

build(input_shape): This is where weights are defined; the weights to be trained should be added to the list self.trainable_weights here. Other attributes include self.non_trainable_weights (a list) and self.updates (a list of (tensor, new_tensor) tuples to be updated). This method must set self.built = True, which can be done by calling super(YourLayer, self).build().

call(x): This is where the layer's functionality is defined. Unless you want your layer to support masking, you only need to care about the first parameter of call: the input tensor.

compute_output_shape(input_shape): If your layer modifies the shape of the input data, you should specify the shape transformation here, which allows Keras to perform automatic shape inference.
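For illustration, here is a minimal skeleton of such a subclass, in the spirit of the Keras documentation; the class name MyLayer and the parameter output_dim are placeholders for this sketch, and add_weight is the Keras 2 helper for registering trainable weights:

from keras import backend as K
from keras.engine.topology import Layer

class MyLayer(Layer):
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight for this layer
        self.kernel = self.add_weight(name='kernel',
                                      shape=(input_shape[1], self.output_dim),
                                      initializer='uniform',
                                      trainable=True)
        super(MyLayer, self).build(input_shape)  # sets self.built = True

    def call(self, x):
        return K.dot(x, self.kernel)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.output_dim)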

A good way to learn is to read the source code of the layers Keras has already written and try to understand the logic.

Next, we'll write a custom layer as a practical example.
For learning purposes, comments have been added in this example to explain what some of the functions do.

This layer structure originates from DenseNet; the code is adapted from GitHub.

from keras.layers.core import Layer
from keras.engine import InputSpec
from keras import backend as K
try:
    from keras import initializations
except ImportError:
    from keras import initializers as initializations

# Inherit from the parent class Layer
class Scale(Layer):
    '''This layer adjusts the output of the previous layer by an
    element-wise multiplication:

        out = in * gamma + beta

    where gamma is the weight and beta is the bias.

    Parameters:
        axis: int, the axis along which to scale; axis=-1 selects the
            default (row-major) direction.
        momentum: momentum for the exponential average of the mean and
            standard deviation of the data.
        weights: initial weights; a list of two numpy arrays with
            shapes [(input_shape,), (input_shape,)].
        beta_init: name of the initialization method for the bias
            (see keras.initializers). Used only if weights is not passed.
        gamma_init: name of the initialization method for the weights
            (see keras.initializers). Used only if weights is not passed.
    '''
    def __init__(self, weights=None, axis=-1, beta_init='zero',
                 gamma_init='one', momentum=0.9, **kwargs):
        # **kwargs passes the remaining keyword arguments on to the parent class
        self.momentum = momentum
        self.axis = axis
        self.beta_init = initializations.get(beta_init)
        self.gamma_init = initializations.get(gamma_init)
        self.initial_weights = weights
        super(Scale, self).__init__(**kwargs)

    def build(self, input_shape):
        self.input_spec = [InputSpec(shape=input_shape)]
        # 1: InputSpec(dtype=None, shape=None, ndim=None, max_ndim=None,
        #    min_ndim=None, axes=None)
        #    Specifies the ndim, dtype and shape of every input to a layer.
        #    Every layer should expose (if appropriate) an `input_spec`
        #    attribute: a list of InputSpec instances (one per input tensor).
        #    A None entry in a shape is compatible with any dimension.
        # 2: self.input_spec: a list of InputSpec instances; each entry
        #    describes one required input (its ndim, dtype). A layer with
        #    n input tensors must have an `input_spec` of length n.
        shape = (int(input_shape[self.axis]),)

        # Compatibility with TensorFlow >= 1.0.0
        self.gamma = K.variable(self.gamma_init(shape),
                                name='{}_gamma'.format(self.name))
        self.beta = K.variable(self.beta_init(shape),
                               name='{}_beta'.format(self.name))
        self.trainable_weights = [self.gamma, self.beta]

        if self.initial_weights is not None:
            self.set_weights(self.initial_weights)
            del self.initial_weights

    def call(self, x, mask=None):
        input_shape = self.input_spec[0].shape
        broadcast_shape = [1] * len(input_shape)
        broadcast_shape[self.axis] = input_shape[self.axis]
        out = K.reshape(self.gamma, broadcast_shape) * x \
            + K.reshape(self.beta, broadcast_shape)
        return out

    def get_config(self):
        config = {"momentum": self.momentum, "axis": self.axis}
        base_config = super(Scale, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))

The above is an example of writing a custom layer, which you can add directly to your own model.
Save the layer in custom_layers.py and invoke it via import:

from custom_layers import Scale

def mynet(growth_rate=32,
          nb_filter=64,
          reduction=0.0,
          dropout_rate=0.0,
          weight_decay=1e-4, ...):
    ...
    x = ...  # output of the previous layer
    x = Scale(axis=concat_axis, name='scale')(x)
    ...
    model = Model(input, x, name='mynet')
    return model
