C++ convolutional neural network example: tiny_cnn code explanation (9) -- partial_connected_layer Structure Analysis (Part 2)

In the previous post we analyzed the structure of the member variables of the partial_connected_layer class. In this post we continue with a brief introduction to the remaining member functions of the class.

1. Constructor

Since partial_connected_layer inherits from the base class layer, its constructor likewise splits into two parts: a call to the base-class constructor and the initialization of its own member variables:

    partial_connected_layer(layer_size_t in_dim, layer_size_t out_dim,
                            size_t weight_dim, size_t bias_dim,
                            float_t scale_factor = 1.0)
        : layer(in_dim, out_dim, weight_dim, bias_dim),
          weight2io_(weight_dim), out2wi_(out_dim), in2wo_(in_dim),
          bias2out_(bias_dim), out2bias_(out_dim), scale_factor_(scale_factor) {}

Here each member variable is sized directly from the constructor arguments. weight2io_ is sized by the total number of weights in the layer's mapping matrix, i.e. (convolution kernel size)^2 * number of input channels * number of convolution kernels; out2wi_ is sized by the dimension of the layer's output feature map, i.e. (in_width - window_size + 1) * (in_height - window_size + 1) * out_channels (for example, a 32x32 single-channel input with a 5x5 kernel and 6 output maps gives 28 * 28 * 6 = 4704 entries); in2wo_ is sized by the dimension of the layer's input, i.e. rows * columns * channels of the input data matrix; bias2out_ is sized by the total number of bias terms in the layer; and out2bias_, like out2wi_, is sized by the output dimension.

In addition, a shorthand for the base class layer is defined inside partial_connected_layer for later use (the original post calls it a macro; it is actually a typedef):
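The post does not reproduce the definition, but judging from the tiny_cnn sources it is presumably a type alias along these lines (a sketch; N and Activation are the template parameters carried through the layer hierarchy):

    // Hedged sketch: shorthand for the base class, so later code can refer
    // to it without spelling out the full template arguments.
    typedef layer<N, Activation> Base;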

2. Layer parameter and connection counting functions

Because the convolution layer and the subsampling layer contain a large number of parameters, three counting functions are provided to help you understand the parameter count and connection size of each layer. The first returns the number of parameters to be learned at the current layer (convolution kernel weights plus biases):
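A sketch of this counting function, reconstructed from the tiny_cnn sources (the member names follow the previous post; treat the details as an approximation rather than a verbatim quote):

    // Count learnable parameters: every weight slot and every bias slot that
    // participates in at least one connection counts as one parameter.
    size_t param_size() const {
        size_t total_param = 0;
        for (const auto& w : weight2io_)
            if (w.size() > 0) total_param++;
        for (const auto& b : bias2out_)
            if (b.size() > 0) total_param++;
        return total_param;
    }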

Returns the number of connections between the current layer and the previous layer:
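A sketch along the same lines (again an approximation of the tiny_cnn code):

    // Count connections to the previous layer: every (weight, input/output)
    // pair in the connection tables is one connection, as is each bias link.
    size_t connection_size() const {
        size_t total_size = 0;
        for (const auto& io : weight2io_)
            total_size += io.size();
        for (const auto& b : bias2out_)
            total_size += b.size();
        return total_size;
    }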

Returns the feature output dimension of the current layer:
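In the tiny_cnn sources this appears to correspond to fan_in_size(), which reports the number of connections feeding a single output unit (a sketch; max_size is assumed to be tiny_cnn's helper returning the size of the longest inner vector):

    // Per-output fan-in, used for example when scaling weight initialization.
    size_t fan_in_size() const {
        return max_size(out2wi_);
    }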

3. Forward propagation function forward_propagation

Because partial_connected_layer is the common base class of the convolution layer and the subsampling layer, and both of those layers need forward- and back-propagation, the author chose to define the forward- and back-propagation algorithms in partial_connected_layer rather than in the two subclasses. The reason for this choice will be explained in detail in a later post.

The encapsulated forward-propagation algorithm is similar to the one in the fully connected layer described earlier. It consists of three parts: the convolution itself, passing the convolution result to the output, and the recursion into the next layer.

The first step is the convolution: accumulate the weighted inputs, apply the corresponding coefficient expansion (the coefficient defaults to 1, so this step can be ignored), and add the bias value. These operations are implemented in a lambda expression:
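A hedged reconstruction of that lambda, using the member names from the previous posts (W_, b_, a_ and the for_/blocked_range parallel-for helpers are assumptions taken from the tiny_cnn code base, not quoted from this post):

    // For each output unit i: sum the weighted inputs over its connection
    // list, scale by the coefficient, and add the shared bias.
    for_(parallelize_, 0, out_size_, [&](const blocked_range& r) {
        for (int i = r.begin(); i < r.end(); i++) {
            const wi_connections& connections = out2wi_[i]; // (weight-index, input-index) pairs
            float_t a = 0.0;
            for (const auto& connection : connections)
                a += W_[connection.first] * in[connection.second];
            a *= scale_factor_;      // coefficient expansion; a no-op when 1.0
            a += b_[out2bias_[i]];   // bias shared by this output unit
            a_[index][i] = a;        // store the pre-activation value (per-worker buffer)
        }
    });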

Next, the convolution result must be passed to the output array; before that, it is run through the activation function:
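A sketch of this step, assuming h_ is the activation functor held by the layer base class and output_ is the per-worker output buffer:

    // Run the pre-activation values through the activation function and
    // write the result into the output array.
    vec_t& out = output_[index];
    for (size_t i = 0; i < out_size_; i++)
        out[i] = h_.f(a_[index], i);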

Finally, the algorithm propagates to the next layer by recursion (both forward and backward propagation are driven by recursion):
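The recursion itself is the same one-liner pattern seen in the fully connected layer (next_ is assumed from the layer base class):

    // Hand our output to the next layer if there is one; otherwise this is
    // the last layer and out is the network's final output.
    return next_ ? next_->forward_propagation(out, index) : out;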

As for the back_propagation function, its structure is more involved, so it will be described in the later posts on the BP algorithm; roughly the first third of this series is devoted to analyzing tiny_cnn's forward-propagation process. The back_propagation() function is therefore left as an open thread for subsequent posts to fill in.

OK, that is all for this post. We have now covered most of the member variables and member functions of the partial_connected_layer class (except for the back-propagation algorithm). In the next post we will continue by analyzing the underlying base classes, layer and layer_base, and then begin to study the forward-propagation implementation of convolutional networks.

4. Notes

1. Questions about the function call format and construction process

It is worth emphasizing here that tiny_cnn uses a "stream"-like form to construct the entire network structure:
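For example, the LeNet-style samples shipped with tiny_cnn wire a network up roughly like this (the layer types and sizes here are illustrative, not taken from this post):

    // Stream layers into the network; operator<< appends each layer and
    // connects it to the previous one.
    network<mse, gradient_descent> nn;
    nn << convolutional_layer<tan_h>(32, 32, 5, 1, 6)    // 32x32x1 in, 5x5 kernel, 6 maps
       << average_pooling_layer<tan_h>(28, 28, 6, 2)     // 2x2 subsampling: 28x28 -> 14x14
       << fully_connected_layer<tan_h>(14 * 14 * 6, 10); // 10-way output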

This stream operation is defined in the network class. I will introduce the network class, and this stream-construction technique, in detail later (another thread to fill in). For now it is enough to know that, on the surface, this stream style builds the entire network structure in one shot, which also makes it inconvenient to inspect the initialization parameters of the intermediate layers while debugging; we will see how to deal with that later.
