In the ImageNet image classification challenge, AlexNet, the network architecture proposed by Alex Krizhevsky, won the 2012 championship. Anyone studying the application of CNN-style deep learning models to image classification cannot get around AlexNet: it is the classic CNN model for image classification of the deep learning era.
Caffe, the open-source deep learning framework, ships a reference implementation of this model (CaffeNet); the detailed network configuration file is at https://github.com/BVLC/caffe/blob/master/models/bvlc_reference_caffenet/train_val.prototxt.
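Before walking through the layers one by one, it can help to load the net with pycaffe and print every blob's shape; the shapes printed this way match the log excerpt at the end of this article. The following is a minimal sketch (not from the original post); it assumes a working Caffe build with the Python bindings, that it is run from the Caffe root directory, and that the LMDB sources referenced by the data layer exist:

```python
import caffe

# Build the training-phase net from the configuration file; Caffe constructs
# every layer and reports each top blob's shape, as in the log shown below.
net = caffe.Net('models/bvlc_reference_caffenet/train_val.prototxt', caffe.TRAIN)

# net.blobs is an OrderedDict mapping blob name -> blob; .data.shape is (N, C, H, W).
for name, blob in net.blobs.items():
    print(name, blob.data.shape)
```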
This article now steps through the network configuration and interprets each stage in turn (training phase); a script that reproduces each stage's output shape follows the list:
1. conv1 phase DFD (data flow diagram):
2. conv2 phase DFD (data flow diagram):
3. conv3 phase DFD (data flow diagram):
4. conv4 phase DFD (data flow diagram):
5. conv5 phase DFD (data flow diagram):
6. fc6 phase DFD (data flow diagram):
7. fc7 phase DFD (data flow diagram):
8. fc8 phase DFD (data flow diagram):
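Since the stage diagrams above are figures, the data flow can also be verified numerically with the standard convolution/pooling arithmetic, out = (in + 2*pad - kernel) / stride + 1. The sketch below is mine, not the original author's; the kernel, stride, and pad values are taken from the bvlc_reference_caffenet train_val.prototxt, and the printed shapes match the Top shape lines in the log at the end of this article:

```python
# Reproduce the Top shape values from the Caffe log below.
# Layer hyper-parameters are those of bvlc_reference_caffenet's train_val.prototxt.

def conv_out(size, kernel, stride=1, pad=0):
    """Output spatial size of a convolution layer."""
    return (size + 2 * pad - kernel) // stride + 1

def pool_out(size, kernel, stride):
    """Output spatial size of a pooling layer (division is exact here)."""
    return (size - kernel) // stride + 1

n = 256                           # batch size in the training phase
s, c = 227, 3                     # cropped input: 227 x 227 x 3
print("data :", (n, c, s, s))

s, c = conv_out(s, 11, 4), 96     # conv1: 96 kernels, 11x11, stride 4
print("conv1:", (n, c, s, s))     # -> (256, 96, 55, 55)
s = pool_out(s, 3, 2)             # pool1: 3x3, stride 2
print("pool1:", (n, c, s, s))     # -> (256, 96, 27, 27); norm1 keeps this shape

s, c = conv_out(s, 5, 1, 2), 256  # conv2: 256 kernels, 5x5, pad 2, group 2
print("conv2:", (n, c, s, s))     # -> (256, 256, 27, 27)
s = pool_out(s, 3, 2)             # pool2: 3x3, stride 2
print("pool2:", (n, c, s, s))     # -> (256, 256, 13, 13); norm2 keeps this shape

s, c = conv_out(s, 3, 1, 1), 384  # conv3: 384 kernels, 3x3, pad 1
print("conv3:", (n, c, s, s))     # -> (256, 384, 13, 13)
s, c = conv_out(s, 3, 1, 1), 384  # conv4: 3x3, pad 1, group 2
print("conv4:", (n, c, s, s))     # -> (256, 384, 13, 13)
s, c = conv_out(s, 3, 1, 1), 256  # conv5: 3x3, pad 1, group 2
print("conv5:", (n, c, s, s))     # -> (256, 256, 13, 13)
s = pool_out(s, 3, 2)             # pool5: 3x3, stride 2
print("pool5:", (n, c, s, s))     # -> (256, 256, 6, 6)

print("fc6  :", (n, 4096))        # fully connected: 256*6*6 = 9216 -> 4096
print("fc7  :", (n, 4096))
```

Note that Caffe's pooling layer actually rounds up (ceiling division) rather than down, but for these sizes the division is exact, so floor and ceiling agree.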
Further explanation of the operations performed by each layer type can be found at http://caffe.berkeleyvision.org/tutorial/layers.html
Working through the model's data flow as above, the total parameter count comes out on the order of 50 million or more (a layer-by-layer count is sketched below).
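This figure can be checked by summing weights and biases layer by layer. The rough count below assumes the standard CaffeNet dimensions: the group-2 convolutions (conv2, conv4, conv5) halve the input channels seen by each kernel, and fc8 has 1000 outputs in the reference model (the log below shows a 22-way fc8 instead, which lowers the total to about 57 million):

```python
# Rough parameter count for CaffeNet (weights + biases per layer).
conv_layers = [
    # (name, output channels, input channels per kernel, kernel size)
    ("conv1",  96,    3, 11),
    ("conv2", 256,   48,  5),   # group 2: 96/2 = 48 input channels per kernel
    ("conv3", 384,  256,  3),
    ("conv4", 384,  192,  3),   # group 2: 384/2 = 192
    ("conv5", 256,  192,  3),   # group 2
]
fc_layers = [
    ("fc6", 4096, 256 * 6 * 6),  # pool5 output flattened: 9216 inputs
    ("fc7", 4096, 4096),
    ("fc8", 1000, 4096),         # reference model; a 22-way fc8 gives ~57M total
]

total = 0
for name, c_out, c_in, k in conv_layers:
    n = c_out * c_in * k * k + c_out   # weights + biases
    total += n
    print(f"{name}: {n:,}")
for name, c_out, c_in in fc_layers:
    n = c_out * c_in + c_out
    total += n
    print(f"{name}: {n:,}")
print(f"total: {total:,}")             # ~60.9 million parameters
```

Note that the fully connected layers dominate: fc6 alone contributes about 37.8 million of the roughly 61 million parameters.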
Caffe also logs the construction of this network; the details are as follows:
I0721 10:38:15.326920 4692 net.cpp:125] Top shape: 256 3 227 227 (39574272)
I0721 10:38:15.326971 4692 net.cpp:125] Top shape: 256 1 1 1 (256)
I0721 10:38:15.326982 4692 net.cpp:156] data does not need backward computation.
I0721 10:38:15.327003 4692 net.cpp:74] Creating Layer conv1
I0721 10:38:15.327011 4692 net.cpp:84] conv1 <- data
I0721 10:38:15.327033 4692 net.cpp:110] conv1 -> conv1
I0721 10:38:16.721956 4692 net.cpp:125] Top shape: 256 96 55 55 (74342400)
I0721 10:38:16.722030 4692 net.cpp:151] conv1 needs backward computation.
I0721 10:38:16.722059 4692 net.cpp:74] Creating Layer relu1
I0721 10:38:16.722070 4692 net.cpp:84] relu1 <- conv1
I0721 10:38:16.722082 4692 net.cpp:98] relu1 -> conv1 (in-place)
I0721 10:38:16.722096 4692 net.cpp:125] Top shape: 256 96 55 55 (74342400)
I0721 10:38:16.722105 4692 net.cpp:151] relu1 needs backward computation.
I0721 10:38:16.722116 4692 net.cpp:74] Creating Layer pool1
I0721 10:38:16.722125 4692 net.cpp:84] pool1 <- conv1
I0721 10:38:16.722133 4692 net.cpp:110] pool1 -> pool1
I0721 10:38:16.722167 4692 net.cpp:125] Top shape: 256 96 27 27 (17915904)
I0721 10:38:16.722187 4692 net.cpp:151] pool1 needs backward computation.
I0721 10:38:16.722205 4692 net.cpp:74] Creating Layer norm1
I0721 10:38:16.722221 4692 net.cpp:84] norm1 <- pool1
I0721 10:38:16.722234 4692 net.cpp:110] norm1 -> norm1
I0721 10:38:16.722251 4692 net.cpp:125] Top shape: 256 96 27 27 (17915904)
I0721 10:38:16.722260 4692 net.cpp:151] norm1 needs backward computation.
I0721 10:38:16.722272 4692 net.cpp:74] Creating Layer conv2
I0721 10:38:16.722280 4692 net.cpp:84] conv2 <- norm1
I0721 10:38:16.722290 4692 net.cpp:110] conv2 -> conv2
I0721 10:38:16.725225 4692 net.cpp:125] Top shape: 256 256 27 27 (47775744)
I0721 10:38:16.725242 4692 net.cpp:151] conv2 needs backward computation.
I0721 10:38:16.725253 4692 net.cpp:74] Creating Layer relu2
I0721 10:38:16.725261 4692 net.cpp:84] relu2 <- conv2
I0721 10:38:16.725270 4692 net.cpp:98] relu2 -> conv2 (in-place)
I0721 10:38:16.725280 4692 net.cpp:125] Top shape: 256 256 27 27 (47775744)
I0721 10:38:16.725288 4692 net.cpp:151] relu2 needs backward computation.
I0721 10:38:16.725298 4692 net.cpp:74] Creating Layer pool2
I0721 10:38:16.725307 4692 net.cpp:84] pool2 <- conv2
I0721 10:38:16.725317 4692 net.cpp:110] pool2 -> pool2
I0721 10:38:16.725329 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
I0721 10:38:16.725338 4692 net.cpp:151] pool2 needs backward computation.
I0721 10:38:16.725358 4692 net.cpp:74] Creating Layer norm2
I0721 10:38:16.725368 4692 net.cpp:84] norm2 <- pool2
I0721 10:38:16.725378 4692 net.cpp:110] norm2 -> norm2
I0721 10:38:16.725389 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
I0721 10:38:16.725399 4692 net.cpp:151] norm2 needs backward computation.
I0721 10:38:16.725409 4692 net.cpp:74] Creating Layer conv3
I0721 10:38:16.725419 4692 net.cpp:84] conv3 <- norm2
I0721 10:38:16.725427 4692 net.cpp:110] conv3 -> conv3
I0721 10:38:16.735193 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
I0721 10:38:16.735213 4692 net.cpp:151] conv3 needs backward computation.
I0721 10:38:16.735224 4692 net.cpp:74] Creating Layer relu3
I0721 10:38:16.735234 4692 net.cpp:84] relu3 <- conv3
I0721 10:38:16.735242 4692 net.cpp:98] relu3 -> conv3 (in-place)
I0721 10:38:16.735250 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
I0721 10:38:16.735258 4692 net.cpp:151] relu3 needs backward computation.
I0721 10:38:16.735302 4692 net.cpp:74] Creating Layer conv4
I0721 10:38:16.735312 4692 net.cpp:84] conv4 <- conv3
I0721 10:38:16.735321 4692 net.cpp:110] conv4 -> conv4
I0721 10:38:16.743952 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
I0721 10:38:16.743988 4692 net.cpp:151] conv4 needs backward computation.
I0721 10:38:16.744000 4692 net.cpp:74] Creating Layer relu4
I0721 10:38:16.744010 4692 net.cpp:84] relu4 <- conv4
I0721 10:38:16.744020 4692 net.cpp:98] relu4 -> conv4 (in-place)
I0721 10:38:16.744030 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
I0721 10:38:16.744038 4692 net.cpp:151] relu4 needs backward computation.
I0721 10:38:16.744050 4692 net.cpp:74] Creating Layer conv5
I0721 10:38:16.744057 4692 net.cpp:84] conv5 <- conv4
I0721 10:38:16.744067 4692 net.cpp:110] conv5 -> conv5
I0721 10:38:16.748935 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
I0721 10:38:16.748955 4692 net.cpp:151] conv5 needs backward computation.
I0721 10:38:16.748965 4692 net.cpp:74] Creating Layer relu5
I0721 10:38:16.748975 4692 net.cpp:84] relu5 <- conv5
I0721 10:38:16.748983 4692 net.cpp:98] relu5 -> conv5 (in-place)
I0721 10:38:16.748998 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
I0721 10:38:16.749011 4692 net.cpp:151] relu5 needs backward computation.
I0721 10:38:16.749022 4692 net.cpp:74] Creating Layer pool5
I0721 10:38:16.749030 4692 net.cpp:84] pool5 <- conv5
I0721 10:38:16.749039 4692 net.cpp:110] pool5 -> pool5
I0721 10:38:16.749050 4692 net.cpp:125] Top shape: 256 256 6 6 (2359296)
I0721 10:38:16.749058 4692 net.cpp:151] pool5 needs backward computation.
I0721 10:38:16.749074 4692 net.cpp:74] Creating Layer fc6
I0721 10:38:16.749083 4692 net.cpp:84] fc6 <- pool5
I0721 10:38:16.749091 4692 net.cpp:110] fc6 -> fc6
I0721 10:38:17.160079 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
I0721 10:38:17.160148 4692 net.cpp:151] fc6 needs backward computation.
I0721 10:38:17.160166 4692 net.cpp:74] Creating Layer relu6
I0721 10:38:17.160177 4692 net.cpp:84] relu6 <- fc6
I0721 10:38:17.160190 4692 net.cpp:98] relu6 -> fc6 (in-place)
I0721 10:38:17.160202 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
I0721 10:38:17.160212 4692 net.cpp:151] relu6 needs backward computation.
I0721 10:38:17.160222 4692 net.cpp:74] Creating Layer drop6
I0721 10:38:17.160230 4692 net.cpp:84] drop6 <- fc6
I0721 10:38:17.160238 4692 net.cpp:98] drop6 -> fc6 (in-place)
I0721 10:38:17.160258 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
I0721 10:38:17.160265 4692 net.cpp:151] drop6 needs backward computation.
I0721 10:38:17.160277 4692 net.cpp:74] Creating Layer fc7
I0721 10:38:17.160286 4692 net.cpp:84] fc7 <- fc6
I0721 10:38:17.160295 4692 net.cpp:110] fc7 -> fc7
I0721 10:38:17.342094 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
I0721 10:38:17.342157 4692 net.cpp:151] fc7 needs backward computation.
I0721 10:38:17.342175 4692 net.cpp:74] Creating Layer relu7
I0721 10:38:17.342185 4692 net.cpp:84] relu7 <- fc7
I0721 10:38:17.342198 4692 net.cpp:98] relu7 -> fc7 (in-place)
I0721 10:38:17.342208 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
I0721 10:38:17.342217 4692 net.cpp:151] relu7 needs backward computation.
I0721 10:38:17.342228 4692 net.cpp:74] Creating Layer drop7
I0721 10:38:17.342236 4692 net.cpp:84] drop7 <- fc7
I0721 10:38:17.342245 4692 net.cpp:98] drop7 -> fc7 (in-place)
I0721 10:38:17.342254 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
I0721 10:38:17.342262 4692 net.cpp:151] drop7 needs backward computation.
I0721 10:38:17.342274 4692 net.cpp:74] Creating Layer fc8
I0721 10:38:17.342283 4692 net.cpp:84] fc8 <- fc7
I0721 10:38:17.342291 4692 net.cpp:110] fc8 -> fc8
I0721 10:38:17.343199 4692 net.cpp:125] Top shape: 256 22 1 1 (5632)
I0721 10:38:17.343214 4692 net.cpp:151] fc8 needs backward computation.
I0721 10:38:17.343231 4692 net.cpp:74] Creating Layer loss
I0721 10:38:17.343240 4692 net.cpp:84] loss <- fc8
I0721 10:38:17.343250 4692 net.cpp:84] loss <- label
I0721 10:38:17.343264 4692 net.cpp:151] loss needs backward computation.
I0721 10:38:17.343305 4692 net.cpp:173] Collecting Learning Rate and Weight Decay.
I0721 10:38:17.343327 4692 net.cpp:166] Network initialization done.
I0721 10:38:17.343335 4692 net.cpp:167] Memory required for Data 1073760256
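One detail worth verifying in the log: the final line reports "Memory required for Data 1073760256". This is exactly 4 bytes per float32 element times the sum of the distinct top blob counts reported above; in-place ReLU and Dropout layers reuse their bottom blob, so they add nothing. A quick check:

```python
# Sanity check on the last log line: "Memory required for Data 1073760256".
# Each distinct top blob is counted once, at 4 bytes per float32 element.
blob_counts = [
    39574272,  # data   (256 x 3 x 227 x 227)
    256,       # label  (256 x 1 x 1 x 1)
    74342400,  # conv1  (relu1 is in-place, so not counted again)
    17915904,  # pool1
    17915904,  # norm1
    47775744,  # conv2
    11075584,  # pool2
    11075584,  # norm2
    16613376,  # conv3
    16613376,  # conv4
    11075584,  # conv5
    2359296,   # pool5
    1048576,   # fc6 (relu6/drop6 in-place)
    1048576,   # fc7 (relu7/drop7 in-place)
    5632,      # fc8
]
print(sum(blob_counts) * 4)  # -> 1073760256, matching the log
```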