MXNet: Deep Learning Computation - Custom Layers


Although Gluon provides a large number of commonly used layers, sometimes we still want to define our own. This section describes how to use NDArray to define a custom Gluon layer that can be reused later.

Custom layers without model parameters

Let's first describe how to define a custom layer that does not contain model parameters. This is, in fact, similar to constructing a model with Block, as described in the "Model Construction" section.

By inheriting Block, we define a layer that subtracts the mean from its input: the CenteredLayer class. The layer's computation is placed in the forward function.

from mxnet import nd
from mxnet.gluon import nn

class CenteredLayer(nn.Block):
    def __init__(self, **kwargs):
        super(CenteredLayer, self).__init__(**kwargs)

    def forward(self, x):
        return x - x.mean()

layer = CenteredLayer()
layer(nd.array([1, 2, 3, 4, 5]))
# output
[-2. -1.  0.  1.  2.]
<NDArray 5 @cpu(0)>
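Note that x.mean() reduces over all elements, so the layer centers an input of any shape, not just vectors. A small sketch to illustrate, continuing from the block above (the 2x3 input is chosen arbitrarily):

x = nd.array([[1, 2, 3], [4, 5, 6]])
layer(x)          # every element is shifted by the global mean, 3.5
layer(x).mean()   # close to zero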

We can also use it to construct more complex models.

net = nn.Sequential()
net.add(nn.Dense(128))
net.add(nn.Dense(10))
net.add(CenteredLayer())
net.initialize()
y = net(nd.random.uniform(shape=(4, 8)))
y.mean()
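Because the last layer centers its input, the mean of y should be very close to zero; it is usually not exactly zero due to floating-point rounding. A quick sanity check (the printed magnitude depends on the random initialization):

# extract the mean as a Python float; expect a tiny value, e.g. on the order of 1e-9
print(y.mean().asscalar())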
Custom layers with model parameters

In a custom layer we can also use Block's own member variable params, of type ParameterDict. As the name implies, this is a dictionary of model parameters that maps parameter names of string type to parameters of Parameter type. We can create a Parameter from a ParameterDict through its get function.

from mxnet import gluon

params = gluon.ParameterDict()
params.get('param2', shape=(2, 3))
params
# output
(
  Parameter param2 (shape=(2, 3), dtype=<class 'numpy.float32'>)
)
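Note that get both creates and retrieves: if a parameter with the given name already exists in the dictionary, the existing Parameter is returned rather than a new one being created. A small sketch to illustrate, continuing from the block above:

p1 = params.get('param2', shape=(2, 3))
p2 = params.get('param2', shape=(2, 3))
print(p1 is p2)  # True: the same Parameter object is returned both times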

Now let's look at how to implement a fully connected layer with both weight and bias parameters. It uses ReLU as the activation function. Here, in_units and units are the number of inputs and the number of outputs, respectively.

class MyDense(nn.Block):
    def __init__(self, units, in_units, **kwargs):
        super(MyDense, self).__init__(**kwargs)
        self.weight = self.params.get('weight', shape=(in_units, units))
        self.bias = self.params.get('bias', shape=(units,))

    def forward(self, x):
        linear = nd.dot(x, self.weight.data()) + self.bias.data()
        return nd.relu(linear)

Next we instantiate the MyDense class and inspect its model parameters.

# units: the number of outputs of this layer; in_units: the number of inputs of this layer.
dense = MyDense(units=5, in_units=10)
dense.params
# output
mydense0_ (
  Parameter mydense0_weight (shape=(10, 5), dtype=<class 'numpy.float32'>)
  Parameter mydense0_bias (shape=(5,), dtype=<class 'numpy.float32'>)
)
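Before the layer can run a forward computation, its parameters must be initialized, just as with Gluon's built-in layers. A minimal sketch, using a random input whose second dimension matches in_units:

dense.initialize()
dense(nd.random.uniform(shape=(2, 10)))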

We can also construct models with custom layers. Using them is very similar to using Gluon's built-in layers.

net = nn.Sequential()
net.add(MyDense(32, in_units=64))
net.add(MyDense(2, in_units=32))
net.initialize()
net(nd.random.uniform(shape=(2, 64)))
