"Turn" [Caffe] alexnet interpretation of image classification model of deep learning


Original address: http://blog.csdn.net/sunbaigui/article/details/39938097

Copyright notice: This is the blogger's original article; it may not be reproduced without the blogger's permission.

In the ImageNet image classification challenge, the AlexNet network structure proposed by Alex Krizhevsky won the 2012 championship. To study how CNN-style deep learning models apply to image classification, there is no getting around AlexNet: it is the classic CNN model for image classification (from after deep learning took off).

Caffe's open-source model samples also include a reproduction of AlexNet; the specific network configuration file is at https://github.com/BVLC/caffe/blob/master/models/bvlc_reference_caffenet/train_val.prototxt.

Next, this article walks through each layer of the network configuration with a detailed interpretation (training phase); a shape-propagation sketch in Python follows the stage list:

1. conv1 stage DFD (data flow diagram):

2. conv2 stage DFD (data flow diagram):

3. conv3 stage DFD (data flow diagram):

4. conv4 stage DFD (data flow diagram):

5. conv5 stage DFD (data flow diagram):

6. fc6 stage DFD (data flow diagram):

7. fc7 stage DFD (data flow diagram):

8. fc8 stage DFD (data flow diagram):
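
The eight stage diagrams boil down to shape arithmetic, so the data flow can be checked with a few lines of Python. This is a minimal sketch, assuming the kernel/stride/pad hyperparameters of the standard bvlc_reference_caffenet train_val.prototxt (the article's text does not restate them); the resulting sizes match the "Top shape" lines in the Caffe log further below.

  # Spatial-size propagation through CaffeNet/AlexNet (training phase).
  # Hyperparameters are assumed from the standard bvlc_reference_caffenet
  # train_val.prototxt. Caffe's pooling layers actually round up (ceil),
  # but every division below is exact, so floor division gives the same
  # result. relu/norm layers preserve shape and are omitted.

  def out_size(n, kernel, stride=1, pad=0):
      """Output spatial size of a conv/pool layer."""
      return (n + 2 * pad - kernel) // stride + 1

  n = 227                               # data:  256 x 3   x 227 x 227
  n = out_size(n, 11, stride=4)         # conv1: 256 x 96  x 55  x 55
  n = out_size(n, 3, stride=2)          # pool1: 256 x 96  x 27  x 27
  n = out_size(n, 5, pad=2)             # conv2: 256 x 256 x 27  x 27
  n = out_size(n, 3, stride=2)          # pool2: 256 x 256 x 13  x 13
  n = out_size(n, 3, pad=1)             # conv3: 256 x 384 x 13  x 13
  n = out_size(n, 3, pad=1)             # conv4: 256 x 384 x 13  x 13
  n = out_size(n, 3, pad=1)             # conv5: 256 x 256 x 13  x 13
  n = out_size(n, 3, stride=2)          # pool5: 256 x 256 x 6   x 6
  print(n)                              # 6; fc6 flattens 256*6*6 = 9216 inputs

After pool5, fc6/fc7/fc8 are fully connected: 9216 -> 4096 -> 4096 -> number of classes (22 in the log below, rather than the usual 1000 for ImageNet).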

For more explanation of the operations in the various layers, see http://caffe.berkeleyvision.org/tutorial/layers.html.

Working through the model's data flow calculations, the model has roughly 50 million+ parameters.
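
That figure can be sanity-checked by tallying weights layer by layer. The sketch below is an estimate under the same assumed hyperparameters (conv2, conv4, and conv5 use group=2, so each filter sees only half of its input channels), with the 22-way fc8 that this particular log shows:

  # Rough parameter tally for the CaffeNet/AlexNet variant described here.
  def conv_params(c_in, c_out, k, group=1):
      return c_out * (c_in // group) * k * k + c_out    # weights + biases

  def fc_params(n_in, n_out):
      return n_out * n_in + n_out

  total = sum([
      conv_params(3,   96,  11),           #     34,944  conv1
      conv_params(96,  256, 5, group=2),   #    307,456  conv2
      conv_params(256, 384, 3),            #    885,120  conv3
      conv_params(384, 384, 3, group=2),   #    663,936  conv4
      conv_params(384, 256, 3, group=2),   #    442,624  conv5
      fc_params(256 * 6 * 6, 4096),        # 37,752,832  fc6
      fc_params(4096, 4096),               # 16,781,312  fc7
      fc_params(4096, 22),                 #     90,134  fc8 (22 classes here)
  ])
  print(total)                             # 56,958,358 -- about 57 million

Almost all of the parameters sit in fc6 and fc7, which is characteristic of AlexNet-style architectures.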

These shapes also appear in Caffe's own log output at network initialization, as detailed below:

  I0721 10:38:15.326920 4692 net.cpp:125] Top shape: 256 3 227 227 (39574272)
  I0721 10:38:15.326971 4692 net.cpp:125] Top shape: 256 1 1 1 (256)
  I0721 10:38:15.326982 4692 net.cpp:156] data does not need backward computation.
  I0721 10:38:15.327003 4692 net.cpp:74] Creating Layer conv1
  I0721 10:38:15.327011 4692 net.cpp:84] conv1 <- data
  I0721 10:38:15.327033 4692 net.cpp:110] conv1 -> conv1
  I0721 10:38:16.721956 4692 net.cpp:125] Top shape: 256 96 55 55 (74342400)
  I0721 10:38:16.722030 4692 net.cpp:151] conv1 needs backward computation.
  I0721 10:38:16.722059 4692 net.cpp:74] Creating Layer relu1
  I0721 10:38:16.722070 4692 net.cpp:84] relu1 <- conv1
  I0721 10:38:16.722082 4692 net.cpp:98] relu1 -> conv1 (in-place)
  I0721 10:38:16.722096 4692 net.cpp:125] Top shape: 256 96 55 55 (74342400)
  I0721 10:38:16.722105 4692 net.cpp:151] relu1 needs backward computation.
  I0721 10:38:16.722116 4692 net.cpp:74] Creating Layer pool1
  I0721 10:38:16.722125 4692 net.cpp:84] pool1 <- conv1
  I0721 10:38:16.722133 4692 net.cpp:110] pool1 -> pool1
  I0721 10:38:16.722167 4692 net.cpp:125] Top shape: 256 96 27 27 (17915904)
  I0721 10:38:16.722187 4692 net.cpp:151] pool1 needs backward computation.
  I0721 10:38:16.722205 4692 net.cpp:74] Creating Layer norm1
  I0721 10:38:16.722221 4692 net.cpp:84] norm1 <- pool1
  I0721 10:38:16.722234 4692 net.cpp:110] norm1 -> norm1
  I0721 10:38:16.722251 4692 net.cpp:125] Top shape: 256 96 27 27 (17915904)
  I0721 10:38:16.722260 4692 net.cpp:151] norm1 needs backward computation.
  I0721 10:38:16.722272 4692 net.cpp:74] Creating Layer conv2
  I0721 10:38:16.722280 4692 net.cpp:84] conv2 <- norm1
  I0721 10:38:16.722290 4692 net.cpp:110] conv2 -> conv2
  I0721 10:38:16.725225 4692 net.cpp:125] Top shape: 256 256 27 27 (47775744)
  I0721 10:38:16.725242 4692 net.cpp:151] conv2 needs backward computation.
  I0721 10:38:16.725253 4692 net.cpp:74] Creating Layer relu2
  I0721 10:38:16.725261 4692 net.cpp:84] relu2 <- conv2
  I0721 10:38:16.725270 4692 net.cpp:98] relu2 -> conv2 (in-place)
  I0721 10:38:16.725280 4692 net.cpp:125] Top shape: 256 256 27 27 (47775744)
  I0721 10:38:16.725288 4692 net.cpp:151] relu2 needs backward computation.
  I0721 10:38:16.725298 4692 net.cpp:74] Creating Layer pool2
  I0721 10:38:16.725307 4692 net.cpp:84] pool2 <- conv2
  I0721 10:38:16.725317 4692 net.cpp:110] pool2 -> pool2
  I0721 10:38:16.725329 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
  I0721 10:38:16.725338 4692 net.cpp:151] pool2 needs backward computation.
  I0721 10:38:16.725358 4692 net.cpp:74] Creating Layer norm2
  I0721 10:38:16.725368 4692 net.cpp:84] norm2 <- pool2
  I0721 10:38:16.725378 4692 net.cpp:110] norm2 -> norm2
  I0721 10:38:16.725389 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
  I0721 10:38:16.725399 4692 net.cpp:151] norm2 needs backward computation.
  I0721 10:38:16.725409 4692 net.cpp:74] Creating Layer conv3
  I0721 10:38:16.725419 4692 net.cpp:84] conv3 <- norm2
  I0721 10:38:16.725427 4692 net.cpp:110] conv3 -> conv3
  I0721 10:38:16.735193 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
  I0721 10:38:16.735213 4692 net.cpp:151] conv3 needs backward computation.
  I0721 10:38:16.735224 4692 net.cpp:74] Creating Layer relu3
  I0721 10:38:16.735234 4692 net.cpp:84] relu3 <- conv3
  I0721 10:38:16.735242 4692 net.cpp:98] relu3 -> conv3 (in-place)
  I0721 10:38:16.735250 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
  I0721 10:38:16.735258 4692 net.cpp:151] relu3 needs backward computation.
  I0721 10:38:16.735302 4692 net.cpp:74] Creating Layer conv4
  I0721 10:38:16.735312 4692 net.cpp:84] conv4 <- conv3
  I0721 10:38:16.735321 4692 net.cpp:110] conv4 -> conv4
  I0721 10:38:16.743952 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
  I0721 10:38:16.743988 4692 net.cpp:151] conv4 needs backward computation.
  I0721 10:38:16.744000 4692 net.cpp:74] Creating Layer relu4
  I0721 10:38:16.744010 4692 net.cpp:84] relu4 <- conv4
  I0721 10:38:16.744020 4692 net.cpp:98] relu4 -> conv4 (in-place)
  I0721 10:38:16.744030 4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
  I0721 10:38:16.744038 4692 net.cpp:151] relu4 needs backward computation.
  I0721 10:38:16.744050 4692 net.cpp:74] Creating Layer conv5
  I0721 10:38:16.744057 4692 net.cpp:84] conv5 <- conv4
  I0721 10:38:16.744067 4692 net.cpp:110] conv5 -> conv5
  I0721 10:38:16.748935 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
  I0721 10:38:16.748955 4692 net.cpp:151] conv5 needs backward computation.
  I0721 10:38:16.748965 4692 net.cpp:74] Creating Layer relu5
  I0721 10:38:16.748975 4692 net.cpp:84] relu5 <- conv5
  I0721 10:38:16.748983 4692 net.cpp:98] relu5 -> conv5 (in-place)
  I0721 10:38:16.748998 4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
  I0721 10:38:16.749011 4692 net.cpp:151] relu5 needs backward computation.
  I0721 10:38:16.749022 4692 net.cpp:74] Creating Layer pool5
  I0721 10:38:16.749030 4692 net.cpp:84] pool5 <- conv5
  I0721 10:38:16.749039 4692 net.cpp:110] pool5 -> pool5
  I0721 10:38:16.749050 4692 net.cpp:125] Top shape: 256 256 6 6 (2359296)
  I0721 10:38:16.749058 4692 net.cpp:151] pool5 needs backward computation.
  I0721 10:38:16.749074 4692 net.cpp:74] Creating Layer fc6
  I0721 10:38:16.749083 4692 net.cpp:84] fc6 <- pool5
  I0721 10:38:16.749091 4692 net.cpp:110] fc6 -> fc6
  I0721 10:38:17.160079 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
  I0721 10:38:17.160148 4692 net.cpp:151] fc6 needs backward computation.
  I0721 10:38:17.160166 4692 net.cpp:74] Creating Layer relu6
  I0721 10:38:17.160177 4692 net.cpp:84] relu6 <- fc6
  I0721 10:38:17.160190 4692 net.cpp:98] relu6 -> fc6 (in-place)
  I0721 10:38:17.160202 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
  I0721 10:38:17.160212 4692 net.cpp:151] relu6 needs backward computation.
  I0721 10:38:17.160222 4692 net.cpp:74] Creating Layer drop6
  I0721 10:38:17.160230 4692 net.cpp:84] drop6 <- fc6
  I0721 10:38:17.160238 4692 net.cpp:98] drop6 -> fc6 (in-place)
  I0721 10:38:17.160258 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
  I0721 10:38:17.160265 4692 net.cpp:151] drop6 needs backward computation.
  I0721 10:38:17.160277 4692 net.cpp:74] Creating Layer fc7
  I0721 10:38:17.160286 4692 net.cpp:84] fc7 <- fc6
  I0721 10:38:17.160295 4692 net.cpp:110] fc7 -> fc7
  I0721 10:38:17.342094 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
  I0721 10:38:17.342157 4692 net.cpp:151] fc7 needs backward computation.
  I0721 10:38:17.342175 4692 net.cpp:74] Creating Layer relu7
  I0721 10:38:17.342185 4692 net.cpp:84] relu7 <- fc7
  I0721 10:38:17.342198 4692 net.cpp:98] relu7 -> fc7 (in-place)
  I0721 10:38:17.342208 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
  I0721 10:38:17.342217 4692 net.cpp:151] relu7 needs backward computation.
  I0721 10:38:17.342228 4692 net.cpp:74] Creating Layer drop7
  I0721 10:38:17.342236 4692 net.cpp:84] drop7 <- fc7
  I0721 10:38:17.342245 4692 net.cpp:98] drop7 -> fc7 (in-place)
  I0721 10:38:17.342254 4692 net.cpp:125] Top shape: 256 4096 1 1 (1048576)
  I0721 10:38:17.342262 4692 net.cpp:151] drop7 needs backward computation.
  I0721 10:38:17.342274 4692 net.cpp:74] Creating Layer fc8
  I0721 10:38:17.342283 4692 net.cpp:84] fc8 <- fc7
  I0721 10:38:17.342291 4692 net.cpp:110] fc8 -> fc8
  I0721 10:38:17.343199 4692 net.cpp:125] Top shape: 256 22 1 1 (5632)
  I0721 10:38:17.343214 4692 net.cpp:151] fc8 needs backward computation.
  I0721 10:38:17.343231 4692 net.cpp:74] Creating Layer loss
  I0721 10:38:17.343240 4692 net.cpp:84] loss <- fc8
  I0721 10:38:17.343250 4692 net.cpp:84] loss <- label
  I0721 10:38:17.343264 4692 net.cpp:151] loss needs backward computation.
  I0721 10:38:17.343305 4692 net.cpp:173] Collecting Learning Rate and Weight Decay.
  I0721 10:38:17.343327 4692 net.cpp:166] Network initialization done.
  I0721 10:38:17.343335 4692 net.cpp:167] Memory required for Data 1073760256
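
The final "Memory required for Data" line can be reproduced from the shapes above. This is a minimal check, assuming 4-byte floats and counting each top blob once; the in-place relu/drop layers reuse their bottom blobs and add nothing:

  # Sum the element counts of the non-in-place top blobs in the log,
  # then multiply by 4 bytes per float.
  tops = [
      39574272, 256,                   # data, label
      74342400, 17915904, 17915904,    # conv1, pool1, norm1
      47775744, 11075584, 11075584,    # conv2, pool2, norm2
      16613376, 16613376, 11075584,    # conv3, conv4, conv5
      2359296,                         # pool5
      1048576, 1048576,                # fc6, fc7
      5632,                            # fc8
  ]
  print(sum(tops) * 4)                 # 1073760256 bytes, about 1 GB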

"Turn" [Caffe] alexnet interpretation of image classification model of deep learning
