Interpretation of the AlexNet image classification model in Caffe (deep learning)

Source: Internet
Author: User

Original URL:

http://blog.csdn.net/sunbaigui/article/details/39938097


AlexNet, the network architecture proposed by Alex Krizhevsky, won the 2012 ImageNet image classification challenge. Anyone studying CNN-based deep learning models for image classification cannot avoid AlexNet: it is the classic CNN model for image classification from the era when deep learning took off.

Caffe, the open-source deep learning framework, ships AlexNet among its model examples; the network configuration file is at https://github.com/BVLC/caffe/blob/master/models/bvlc_reference_caffenet/train_val.prototxt. Below, this article steps through the network configuration of each layer in detail (training phase):

1. conv1 phase DFD (data flow diagram):
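The conv1 stage runs the input through convolution, ReLU, max pooling, and LRN. As a sketch, assuming the standard CaffeNet hyper-parameters for this stage (11x11 kernels with stride 4 and no padding, then 3x3 max pooling with stride 2), the spatial sizes reported in the log excerpt further down can be reproduced with the usual output-size formula:

```python
def conv_out(size, kernel, stride=1, pad=0):
    """Spatial output size of a convolution/pooling layer:
    floor((W - K + 2P) / S) + 1."""
    return (size - kernel + 2 * pad) // stride + 1

# conv1: 227x227 input, 11x11 kernel, stride 4, no padding
c1 = conv_out(227, 11, stride=4)
# pool1: 3x3 max pooling, stride 2
p1 = conv_out(c1, 3, stride=2)
print(c1, p1)  # 55 27, matching the log shapes 256 96 55 55 and 256 96 27 27
```

With 96 output channels and a batch size of 256, this gives exactly the `256 96 55 55` and `256 96 27 27` blob shapes that appear in Caffe's log.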


2. conv2 phase DFD (data flow diagram):


3. conv3 phase DFD (data flow diagram):


4. conv4 phase DFD (data flow diagram):


5. conv5 phase DFD (data flow diagram):


6. fc6 phase DFD (data flow diagram):


7. fc7 phase DFD (data flow diagram):


8. fc8 phase DFD (data flow diagram):
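The eight stages above can be traced end-to-end as one shape calculation. The sketch below assumes the standard CaffeNet hyper-parameters (conv2: 5x5 pad 2; conv3-5: 3x3 pad 1; all pools 3x3 stride 2) rather than values taken from the figures, which are not reproduced here; the conv1-conv3 shapes it produces agree with the log excerpt further down.

```python
def out_size(w, k, s=1, p=0):
    # floor((W - K + 2P) / S) + 1
    return (w - k + 2 * p) // s + 1

w = 227                   # input: 227x227x3
w = out_size(w, 11, s=4)  # conv1 -> 55x55x96
w = out_size(w, 3, s=2)   # pool1 -> 27x27x96 (norm1 keeps the shape)
w = out_size(w, 5, p=2)   # conv2 -> 27x27x256
w = out_size(w, 3, s=2)   # pool2 -> 13x13x256 (norm2 keeps the shape)
w = out_size(w, 3, p=1)   # conv3 -> 13x13x384
w = out_size(w, 3, p=1)   # conv4 -> 13x13x384
w = out_size(w, 3, p=1)   # conv5 -> 13x13x256
w = out_size(w, 3, s=2)   # pool5 -> 6x6x256
flat = w * w * 256        # 9216 inputs flattened into fc6
print(flat)               # fc6 -> 4096, fc7 -> 4096, fc8 -> 1000 classes
```

The 9216-dimensional flattened pool5 output is what feeds the fully connected fc6 layer; fc8's 1000 outputs correspond to the 1000 ImageNet classes.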



More detailed explanations of the various layer operations can be found at http://caffe.berkeleyvision.org/tutorial/layers.html.

Tracing the data flow through the model, the parameter count comes to over 50 million (roughly 60 million for the full network).
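That figure can be checked with a quick tally. The layer settings below are the standard CaffeNet ones (an assumption; verify against train_val.prototxt), where conv2, conv4, and conv5 use group=2, so each filter sees only half of the input channels:

```python
def conv_params(n_out, n_in, k, group=1):
    # weights + biases for a square-kernel convolution layer
    return n_out * (n_in // group) * k * k + n_out

def fc_params(n_out, n_in):
    # weights + biases for a fully connected layer
    return n_out * n_in + n_out

total = (
    conv_params(96, 3, 11)               # conv1
    + conv_params(256, 96, 5, group=2)   # conv2
    + conv_params(384, 256, 3)           # conv3
    + conv_params(384, 384, 3, group=2)  # conv4
    + conv_params(256, 384, 3, group=2)  # conv5
    + fc_params(4096, 256 * 6 * 6)       # fc6
    + fc_params(4096, 4096)              # fc7
    + fc_params(1000, 4096)              # fc8
)
print(total)  # about 61 million
```

Note that the three fully connected layers account for the vast majority of the parameters, while the convolution layers contribute only a few million.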

Caffe's log output also records this layer-by-layer setup; an excerpt follows:

I0721 10:38:15.326920  4692 net.cpp:125] Top shape: 256 3 227 227 (39574272)
I0721 10:38:15.326971  4692 net.cpp:125] Top shape: 256 1 1 1 (256)
I0721 10:38:15.326982  4692 net.cpp:156] data does not need backward computation.
I0721 10:38:15.327003  4692 net.cpp:74] Creating Layer conv1
I0721 10:38:15.327011  4692 net.cpp:84] conv1 <- data
I0721 10:38:15.327033  4692 net.cpp:110] conv1 -> conv1
I0721 10:38:16.721956  4692 net.cpp:125] Top shape: 256 96 55 55 (74342400)
I0721 10:38:16.722030  4692 net.cpp:151] conv1 needs backward computation.
I0721 10:38:16.722059  4692 net.cpp:74] Creating Layer relu1
I0721 10:38:16.722070  4692 net.cpp:84] relu1 <- conv1
I0721 10:38:16.722082  4692 net.cpp:98] relu1 -> conv1 (in-place)
I0721 10:38:16.722096  4692 net.cpp:125] Top shape: 256 96 55 55 (74342400)
I0721 10:38:16.722105  4692 net.cpp:151] relu1 needs backward computation.
I0721 10:38:16.722116  4692 net.cpp:74] Creating Layer pool1
I0721 10:38:16.722125  4692 net.cpp:84] pool1 <- conv1
I0721 10:38:16.722133  4692 net.cpp:110] pool1 -> pool1
I0721 10:38:16.722167  4692 net.cpp:125] Top shape: 256 96 27 27 (17915904)
I0721 10:38:16.722187  4692 net.cpp:151] pool1 needs backward computation.
I0721 10:38:16.722205  4692 net.cpp:74] Creating Layer norm1
I0721 10:38:16.722221  4692 net.cpp:84] norm1 <- pool1
I0721 10:38:16.722234  4692 net.cpp:110] norm1 -> norm1
I0721 10:38:16.722251  4692 net.cpp:125] Top shape: 256 96 27 27 (17915904)
I0721 10:38:16.722260  4692 net.cpp:151] norm1 needs backward computation.
I0721 10:38:16.722272  4692 net.cpp:74] Creating Layer conv2
I0721 10:38:16.722280  4692 net.cpp:84] conv2 <- norm1
I0721 10:38:16.722290  4692 net.cpp:110] conv2 -> conv2
I0721 10:38:16.725225  4692 net.cpp:125] Top shape: 256 256 27 27 (47775744)
I0721 10:38:16.725242  4692 net.cpp:151] conv2 needs backward computation.
I0721 10:38:16.725253  4692 net.cpp:74] Creating Layer relu2
I0721 10:38:16.725261  4692 net.cpp:84] relu2 <- conv2
I0721 10:38:16.725270  4692 net.cpp:98] relu2 -> conv2 (in-place)
I0721 10:38:16.725280  4692 net.cpp:125] Top shape: 256 256 27 27 (47775744)
I0721 10:38:16.725288  4692 net.cpp:151] relu2 needs backward computation.
I0721 10:38:16.725298  4692 net.cpp:74] Creating Layer pool2
I0721 10:38:16.725307  4692 net.cpp:84] pool2 <- conv2
I0721 10:38:16.725317  4692 net.cpp:110] pool2 -> pool2
I0721 10:38:16.725329  4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
I0721 10:38:16.725338  4692 net.cpp:151] pool2 needs backward computation.
I0721 10:38:16.725358  4692 net.cpp:74] Creating Layer norm2
I0721 10:38:16.725368  4692 net.cpp:84] norm2 <- pool2
I0721 10:38:16.725378  4692 net.cpp:110] norm2 -> norm2
I0721 10:38:16.725389  4692 net.cpp:125] Top shape: 256 256 13 13 (11075584)
I0721 10:38:16.725399  4692 net.cpp:151] norm2 needs backward computation.
I0721 10:38:16.725409  4692 net.cpp:74] Creating Layer conv3
I0721 10:38:16.725419  4692 net.cpp:84] conv3 <- norm2
I0721 10:38:16.725427  4692 net.cpp:110] conv3 -> conv3
I0721 10:38:16.735193  4692 net.cpp:125] Top shape: 256 384 13 13 (16613376)
