This article uses MATLAB to visualize the convolution kernels of a Caffe model. Only kernel visualization is covered here; feature-map visualization is not.
Reference: http://blog.csdn.net/zb1165048017/article/details/52643188
Preparation: two files are required
1. The model description file, deploy.prototxt (here lenet.prototxt)
2. The trained model itself, lenet_iter_10000.caffemodel (the MNIST example from examples is used here)
Step one:
Create visualizing.m under D:\caffe-master\matlab\demo
clc
clear
addpath('..')                        % add the +caffe package to the path
caffe.set_mode_cpu();                % set CPU mode
model = 'D:/caffe-master/examples/mnist/lenet.prototxt';                % model description
weights = 'D:/caffe-master/examples/mnist/lenet_iter_10000.caffemodel'; % trained parameters
net = caffe.Net(model, weights, 'test');  % load the net
weight_partvisual(net, 1, 1)         % call the partial display function
% weight_partvisual(net, layer_num, channels_num)
% layer_num is the index of the convolution layer; channels_num selects which
% input channel's kernels to show, in the range (0, number of feature maps
% in the previous layer]
Step two:
Create weight_partvisual.m under D:\caffe-master\matlab\demo
function weight_partvisual(net, layer_num, channels_num)
layers = net.layer_names;
convlayer = [];
for i = 1:length(layers)
    if strcmp(layers{i}(1:3), 'con')   % collect the convolution layer names
        convlayer = [convlayer; layers{i}];
    end
end
w = net.layers(convlayer(layer_num, :)).params(1).get_data();  % weights
b = net.layers(convlayer(layer_num, :)).params(2).get_data();  % biases
w = w - min(w(:));
w = w / max(w(:)) * 255;               % scale the weights to [0, 255]
weight = w(:, :, channels_num, :);     % 4-D: kernel height x kernel width x input channels x output channels (number of kernels)
[kernel_r, kernel_c, input_num, kernel_num] = size(w);
map_row = ceil(sqrt(kernel_num));      % number of rows in the montage
map_col = map_row;                     % number of columns
weight_map = zeros(kernel_r*map_row, kernel_c*map_col);
kernelcout_map = 1;
for i = 0:map_row-1
    for j = 0:map_col-1
        if kernelcout_map <= kernel_num
            weight_map(i*kernel_r+1+i:(i+1)*kernel_r+i, j*kernel_c+1+j:(j+1)*kernel_c+j) = weight(:, :, :, kernelcout_map);
            kernelcout_map = kernelcout_map + 1;
        end
    end
end
figure
hAxe = axes('Parent', gcf, ...         % new axes whose parent is the current figure (gcf)
    'Units', 'pixels', ...             % set the units to pixels
    'Position', [0 0 605 705]);        % [left bottom width height] of the axes
axes(hAxe);
imshow(uint8(weight_map))
str1 = strcat('weight num:', num2str(kernelcout_map - 1));
title(str1)
end
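The grid-assembly logic in weight_partvisual can be cross-checked outside MATLAB. The NumPy sketch below is only an illustration (the function name and shapes are my own, not part of the original post); it assumes the same kernel_h x kernel_w x input x output layout that Caffe's MATLAB interface returns, scales the weights to [0, 255], and tiles one input channel's kernels into a square montage with one-pixel gaps, mirroring the MATLAB double loop:

```python
import numpy as np

def kernel_montage(w, channel=0):
    """Tile one input channel's kernels into a square grid image.

    w: 4-D array shaped (kernel_h, kernel_w, input_ch, output_ch),
       matching the layout used in the MATLAB function above.
    """
    # min-max scale the whole tensor to [0, 255], as in the post
    w = w - w.min()
    w = w / w.max() * 255.0

    kh, kw, _, n_kernels = w.shape
    rows = int(np.ceil(np.sqrt(n_kernels)))  # square-ish grid
    cols = rows

    # +1 leaves a one-pixel black gap between adjacent kernels
    canvas = np.zeros((rows * (kh + 1), cols * (kw + 1)))
    k = 0
    for i in range(rows):
        for j in range(cols):
            if k < n_kernels:
                r, c = i * (kh + 1), j * (kw + 1)
                canvas[r:r + kh, c:c + kw] = w[:, :, channel, k]
                k += 1
    return canvas.astype(np.uint8)

# 20 kernels of size 5x5 with 1 input channel, like LeNet's conv1
demo = kernel_montage(np.random.randn(5, 5, 1, 20))
print(demo.shape)  # (30, 30): a 5x5 grid of 5x5 kernels plus gaps
```

The resulting array can be saved or shown with any image library; the montage is equivalent to MATLAB's imshow(uint8(weight_map)).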
Run visualizing.m
Results
I don't see any obvious regularity here; is that because the MNIST images are too small? When a model is trained on ImageNet with 256*256 inputs, the resulting convolution kernels do show some regularity (such as edge-like patterns).
PS: The weights here are scaled as (w - min(w)) / max(w - min(w)) * 255.
I don't fully understand the principle behind this; if any reader understands it clearly, please let me know, thx~
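For what it's worth, this scaling is ordinary min-max normalization: subtracting the minimum shifts the smallest weight to 0, and dividing by the new maximum then multiplying by 255 stretches the values across the full 8-bit grayscale range that imshow(uint8(...)) expects. A tiny NumPy check (the sample values are made up, purely illustrative):

```python
import numpy as np

w = np.array([-0.3, 0.0, 0.1, 0.5])  # made-up weight values
w = w - w.min()        # smallest value becomes 0
w = w / w.max() * 255  # largest value becomes 255
print(w.round(1))      # [  0.   95.6 127.5 255. ]
```

Without this step, raw weights (often small values around zero, both positive and negative) would all round to near-black pixels when cast to uint8.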
[Caffe-Windows] Visualizing caffe-master convolution kernels (using MATLAB)