MatConvNet MNIST Source Code Analysis

Source: Internet
Author: User

The code in this article comes from MatConvNet.
Below is the code with my own comments added:
cnn_mnist_init.m

function net = cnn_mnist_init(varargin)
% CNN_MNIST_INIT Initialize a LeNet-style CNN for MNIST
opts.useBatchNorm = true ;       % whether to use batch normalization
opts.networkType = 'simplenn' ;  % network wrapper; the LeNet structure in SimpleNN form
opts = vl_argparse(opts, varargin) ;

rng('default') ;
rng(0) ;

f = 1/100 ;
net.layers = {} ;
% Each layer is a struct: 'type' is the layer kind, 'stride' the step,
% 'pad' the padding; pooling uses the 'max' method.
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{f*randn(5,5,1,20, 'single'), zeros(1, 20, 'single')}}, ...
                           'stride', 1, ...
                           'pad', 0) ;
net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', 2, ...
                           'pad', 0) ;
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{f*randn(5,5,20,50, 'single'), zeros(1, 50, 'single')}}, ...
                           'stride', 1, ...
                           'pad', 0) ;
net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', 2, ...
                           'pad', 0) ;
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{f*randn(4,4,50,500, 'single'), zeros(1, 500, 'single')}}, ...
                           'stride', 1, ...
                           'pad', 0) ;
net.layers{end+1} = struct('type', 'relu') ;
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{f*randn(1,1,500,10, 'single'), zeros(1, 10, 'single')}}, ...
                           'stride', 1, ...
                           'pad', 0) ;
net.layers{end+1} = struct('type', 'softmaxloss') ;

% Optionally switch to batch normalization.
% A BN layer usually sits between a convolution and the pooling/activation
% that follows it. Layers 1, 4, 7 are the three conv layers: each call
% shifts the later indices by one, so 1, 4, 7 already account for the
% previous inserts.
if opts.useBatchNorm
  net = insertBnorm(net, 1) ;
  net = insertBnorm(net, 4) ;
  net = insertBnorm(net, 7) ;
end

% Meta parameters
net.meta.inputSize = [28 28 1] ;          % input size [h, w, channels]; MNIST is single-channel grayscale
net.meta.trainOpts.learningRate = 0.001 ; % learning rate
net.meta.trainOpts.numEpochs = 20 ;       % number of epochs (full passes over the data, not iterations)
net.meta.trainOpts.batchSize = 100 ;      % mini-batch size; on the effect of batch size on training,
                                          % see my other post on the four major problems of CNNs

% Fill in default values
net = vl_simplenn_tidy(net) ;

% Switch to DagNN if requested; the SimpleNN structure is used here
switch lower(opts.networkType)
  case 'simplenn'
    % done
  case 'dagnn'
    net = dagnn.DagNN.fromSimpleNN(net, 'canonicalNames', true) ;
    net.addLayer('error', dagnn.Loss('loss', 'classerror'), ...
      {'prediction', 'label'}, 'error') ;
  otherwise
    assert(false) ;
end

% --------------------------------------------------------------------
function net = insertBnorm(net, l)
% The actual BN-insertion helper
% --------------------------------------------------------------------
assert(isfield(net.layers{l}, 'weights')) ;
ndim = size(net.layers{l}.weights{1}, 4) ;
layer = struct('type', 'bnorm', ...
               'weights', {{ones(ndim, 1, 'single'), zeros(ndim, 1, 'single')}}, ...
               'learningRate', [1 1 0.05], ...
               'weightDecay', [0 0]) ;
net.layers{l}.biases = [] ;
% horzcat concatenates horizontally: rebuild the layer list with the
% BN layer inserted into the LeNet right after layer l
net.layers = horzcat(net.layers(1:l), layer, net.layers(l+1:end)) ;
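As a sanity check on the layer shapes above, the spatial sizes can be traced with the standard convolution/pooling formula, output = floor((input + 2·pad − kernel)/stride) + 1. The short sketch below (illustrative only; `out_size` is a helper defined here, not part of MatConvNet) walks the 28×28 MNIST input through the network:

```python
def out_size(n, k, stride=1, pad=0):
    # standard conv/pool output-size formula
    return (n + 2 * pad - k) // stride + 1

n = 28                         # MNIST input is 28x28
n = out_size(n, 5)             # conv 5x5, stride 1, pad 0 -> 24
n = out_size(n, 2, stride=2)   # max pool 2x2, stride 2   -> 12
n = out_size(n, 5)             # conv 5x5                 -> 8
n = out_size(n, 2, stride=2)   # max pool 2x2, stride 2   -> 4
n = out_size(n, 4)             # conv 4x4                 -> 1
print(n)  # -> 1
```

The final 4×4 convolution reduces the map to 1×1, so the last 1×1 conv with 10 outputs acts as a fully connected classification layer feeding the softmax loss.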

cnn_mnist_experiments.m

%% Experiment with cnn_mnist, with and without batch normalization
[net_bn, info_bn] = cnn_mnist(...
  'expDir', 'data/mnist-bnorm', 'useBnorm', true) ;
[net_fc, info_fc] = cnn_mnist(...
  'expDir', 'data/mnist-baseline', 'useBnorm', false) ;

% Plotting code
figure(1) ; clf ;
subplot(1,2,1) ;                          % first panel
semilogy(info_fc.val.objective', 'o-') ;  % semilogy: logarithmic y axis
hold all ;
semilogy(info_bn.val.objective', '+--') ;
xlabel('Training samples [x 10^3]') ; ylabel('energy') ;
grid on ;                                 % add grid
h = legend('BSLN', 'BNORM') ;             % add legend
set(h, 'color', 'none') ;
title('objective') ;
subplot(1,2,2) ;
plot(info_fc.val.error', 'o-') ; hold all ;
plot(info_bn.val.error', '+--') ;
h = legend('BSLN-val', 'BSLN-val-5', 'BNORM-val', 'BNORM-val-5') ;
grid on ;
xlabel('Training samples [x 10^3]') ; ylabel('error') ;
set(h, 'color', 'none') ;
title('error') ;
drawnow ;

Graph of results obtained from running the script (objective and error curves for the baseline and batch-norm networks; figure not reproduced here).

