tiny_cnn reading (1)


Starting today, I will post my notes from reading the tiny_cnn source code to this blog. I hope we can exchange ideas on this platform.

How do I read the code? Follow the main flow and ignore the details at first.

First, open the project downloaded from GitHub:

By searching CSDN and the rest of the internet, you can find out what each directory of the project contains.

I use ${root} to represent the path to tiny-cnn-master; this variable will be used in the notes below.

Launch Visual Studio, open this project with it, and find main.cpp.

The sample1_convnet() function is called, as shown in Figure 1.

 

Figure 1: the main() function

Since we did not pass any command-line arguments when running, it is line 45 that executes. In other words, we need to find the function that is called when there are no arguments.

Line 3 shows that the parameter of the sample1_convnet() function defaults to data_dir_path = "../data".

This variable is not used yet, so we do not need to go deeper for now; it is enough to know that data_dir_path is the path to the data.
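
Since the screenshot behind Figure 1 is not reproduced here, the dispatch in main.cpp amounts to something like the sketch below (not the verbatim code, and line numbers such as 45 and 3 depend on the version you downloaded); the stub only stands in for the real sample1_convnet():

    #include <iostream>
    #include <string>

    // Stub standing in for the real sample1_convnet() defined in main.cpp;
    // it only reports which data directory would be used.
    void sample1_convnet(const std::string& data_dir_path = "../data") {
        std::cout << "data_dir_path = " << data_dir_path << "\n";
    }

    int main(int argc, char** argv) {
        if (argc == 2)
            sample1_convnet(argv[1]);   // data directory given on the command line
        else
            sample1_convnet();          // no argument: the default "../data" is used
        return 0;
    }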

 

"Enter the void samplew.convnet (const string & data_dir_path) function:

The first statement:

 
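The screenshot of that statement is not included, but from the two template parameters discussed below it is essentially a declaration of the form:

    // Declare a network whose loss function is mse and whose weight-update
    // rule is gradient_descent_levenberg_marquardt.
    network<mse, gradient_descent_levenberg_marquardt> nn;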

So we need to understand these two template parameters:

mse:

   

@see: ${root}/tiny_cnn/loss_function.h

 

gradient_descent_levenberg_marquardt:

   

@see: ${root}/tiny_cnn/optimizers/optimizer.h

It is not strictly necessary to master these two classes, but since the code is concise we can take a look at them.

mse:

See Figure 2-1.

tiny_cnn supports two types of loss functions:

(1) mean squared error (mse)

(2) cross entropy

 

Now let's read the code:

   

A) The two member functions correspond to two mathematical formulas:

   

  f(y, t) = (y - t)^2 / 2

  df(y, t) = y - t. In theory, y and t can be any real numbers, from negative infinity to positive infinity.

B) tiny_cnn implements the mse loss function itself. If we want to modify or make improvements, this is the place to start;

for example, swapping in a different loss function or refining this one.

Figure 2-1: mse function implementation
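
The screenshot behind Figure 2-1 is not reproduced, so here is a minimal sketch of what the mse class amounts to (tiny_cnn uses its own float_t typedef; plain double is used here to keep the sketch self-contained):

    // Minimal sketch of the mse loss class described above.
    class mse {
    public:
        // Loss of one output y against its target t: f(y, t) = (y - t)^2 / 2.
        static double f(double y, double t) {
            return (y - t) * (y - t) / 2;
        }
        // Derivative of the loss with respect to y: df(y, t) = y - t.
        static double df(double y, double t) {
            return y - t;
        }
    };

For example, with y = 0.8 and t = 1.0, f returns 0.02 and df returns -0.2.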

gradient_descent_levenberg_marquardt:

 
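The screenshot is missing here as well. The essence of this optimizer is a gradient-descent step whose learning rate is damped by a Levenberg-Marquardt-style second-derivative estimate; the sketch below conveys that update rule, but the member names alpha and mu, the update() signature, and the default values are assumptions rather than the verbatim tiny_cnn code:

    // Sketch of a Levenberg-Marquardt-style gradient descent step: weight W
    // moves against its gradient dW, with the learning rate alpha damped by
    // a second-derivative estimate H plus a constant mu, so flat directions
    // (small H) get larger steps than steep ones.
    struct gradient_descent_levenberg_marquardt {
        double alpha = 0.00085;  // learning rate (assumed default)
        double mu    = 0.02;     // damping constant (assumed default)

        void update(double dW, double H, double* W) const {
            *W -= (alpha / (H + mu)) * dW;
        }
    };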

network:

   

@see ${root}/tiny_cnn/network.h
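
The notes stop here. To connect the pieces read so far, a deliberately incomplete, hypothetical skeleton of how network.h ties the two template parameters together might look like this (the members shown are assumptions for illustration only):

    // Hypothetical skeleton: the real class in ${root}/tiny_cnn/network.h has
    // many more members (layer storage, train(), predict(), and so on).
    template <typename LossFunction, typename Optimizer>
    class network {
    public:
        // During back-propagation the gradient of the loss is obtained from
        // LossFunction::df(y, t); each weight is then adjusted by the
        // optimizer instance held below.
    private:
        Optimizer optimizer_;
    };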
