TensorFlow in Practice Series 12: Pooling Layer Network Structure

Source: Internet
Author: User

In convolutional neural networks, a pooling layer is often inserted between convolution layers. The pooling layer can reduce the size of the matrix very effectively, which in turn reduces the number of parameters in the final fully connected layers. Using pooling layers both speeds up computation and helps prevent overfitting. Similar to a convolution layer, the forward propagation of a pooling layer is carried out by sliding a filter-like structure over the input. However, the computation inside a pooling filter is not a weighted sum of nodes, but a simpler maximum or average operation. A pooling layer that takes the maximum is called a max pooling layer (max pooling) and is the most commonly used pooling structure. A pooling layer that takes the average is called an average pooling layer (average pooling).
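To make the difference concrete, the short sketch below (assuming a TensorFlow 1.x session-based setup; the 4x4 input values are made up purely for illustration) applies a 2x2 max pooling filter and a 2x2 average pooling filter with stride 2 to a single feature map.

import numpy as np
import tensorflow as tf

# A single 4x4 feature map, shaped [batch, height, width, depth] = [1, 4, 4, 1].
x = np.array([[1, 3, 2, 4],
              [5, 6, 7, 8],
              [3, 2, 1, 0],
              [1, 2, 3, 4]], dtype=np.float32).reshape(1, 4, 4, 1)

input_tensor = tf.constant(x)
# 2x2 filter, stride 2, no padding: each 2x2 block is reduced to one value.
max_pooled = tf.nn.max_pool(input_tensor, ksize=[1, 2, 2, 1],
                            strides=[1, 2, 2, 1], padding='VALID')
avg_pooled = tf.nn.avg_pool(input_tensor, ksize=[1, 2, 2, 1],
                            strides=[1, 2, 2, 1], padding='VALID')

with tf.Session() as sess:
    print(sess.run(max_pooled)[0, :, :, 0])  # [[6. 8.] [3. 4.]]
    print(sess.run(avg_pooled)[0, :, :, 0])  # [[3.75 5.25] [2. 2.]]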

Similar to the filter in a convolution layer, the filter in a pooling layer also requires manually specifying its size, whether to use all-zero padding, and the stride with which it moves, and these settings have the same meaning as in the convolution layer. The way the filters move is also similar; the only difference is that the filter in a convolution layer spans the entire depth of the input, whereas a pooling filter only affects the nodes at a single depth. Therefore, in addition to moving along the length and width dimensions, a pooling filter also needs to move along the depth dimension. The following TensorFlow program implements the forward propagation of a max pooling layer.

# tf.nn.max_pool implements the forward propagation of the max pooling layer;
# its parameters are similar to those of the tf.nn.conv2d function.
# ksize gives the filter size, strides gives the stride information, and
# padding specifies whether all-zero padding is used.
pool = tf.nn.max_pool(actived_conv, ksize=[1, 3, 3, 1],
                      strides=[1, 2, 2, 1], padding='SAME')
Comparing the TensorFlow implementations of forward propagation for the pooling layer and the convolution layer, we can see that the two functions take parameters of a similar form. In tf.nn.max_pool, the first argument is the node matrix of the current layer, a four-dimensional matrix whose meaning is consistent with the first argument of tf.nn.conv2d. The second argument is the size of the filter. Although it is given as a one-dimensional array of length 4, the first and last numbers of this array must be 1, which means the pooling filter cannot cross different input samples or different depths of the node matrix. The filter sizes most commonly used in practice are [1, 2, 2, 1] or [1, 3, 3, 1].
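As a sketch of how these arguments behave (assuming TensorFlow 1.x, and treating actived_conv as a stand-in for the activated output of a convolution layer with an arbitrarily chosen 1x32x32x16 shape), the following shows that pooling only shrinks the length and width dimensions:

import tensorflow as tf

# Stand-in for the activated output of a convolution layer:
# batch size 1, spatial size 32x32, depth 16.
actived_conv = tf.placeholder(tf.float32, [1, 32, 32, 16])

pool = tf.nn.max_pool(actived_conv, ksize=[1, 3, 3, 1],
                      strides=[1, 2, 2, 1], padding='SAME')

# With stride 2 and SAME padding, the spatial size is halved (32 -> 16),
# while the batch size and the depth are left unchanged.
print(pool.get_shape())  # (1, 16, 16, 16)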

The third argument of tf.nn.max_pool is the stride, which has the same meaning as the stride in tf.nn.conv2d, and its first and last dimensions can also only be 1. This means that in TensorFlow the pooling layer cannot reduce the depth of the node matrix or the number of input samples. The last argument of tf.nn.max_pool specifies whether to use all-zero padding. It also takes only two values, VALID or SAME, where VALID means no all-zero padding is used and SAME means all-zero padding is applied. TensorFlow also provides tf.nn.avg_pool to implement an average pooling layer; its calling format is identical to that of tf.nn.max_pool.
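The sketch below (again assuming TensorFlow 1.x; the 5x5x3 input shape is only an example) contrasts the two padding modes and shows that tf.nn.avg_pool is called with exactly the same arguments as tf.nn.max_pool:

import tensorflow as tf

feature_map = tf.placeholder(tf.float32, [1, 5, 5, 3])

# VALID: no all-zero padding; output spatial size is (5 - 3) / 2 + 1 = 2.
pool_valid = tf.nn.max_pool(feature_map, ksize=[1, 3, 3, 1],
                            strides=[1, 2, 2, 1], padding='VALID')
# SAME: all-zero padding is added; output spatial size is ceil(5 / 2) = 3.
pool_same = tf.nn.avg_pool(feature_map, ksize=[1, 3, 3, 1],
                           strides=[1, 2, 2, 1], padding='SAME')

print(pool_valid.get_shape())  # (1, 2, 2, 3)
print(pool_same.get_shape())   # (1, 3, 3, 3)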


