Water Meter Training

Source: Internet
Author: User

http://caffe.berkeleyvision.org/gathered/examples/mnist.html

Problem: The probability output by LeNet is always 1.

Solution: add a Softmax layer so the outputs are normalized to [0, 1]. Even so, the problem does not seem to be fully solved.

In fact, we need to solve two problems:

A. Output a probability for each class.

B. Detect scanned images that are obviously not digits, discard them, and do not display them.

The added Softmax layer solves the probability-output problem (A). But since any input produces a distribution over [0, 1], a non-digit image will still yield a large probability for some class, so these images cannot be thrown away on the basis of the probability alone.
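To illustrate the point above, here is a minimal NumPy sketch (not from the original post) of why a softmax cannot by itself reject non-digit input: the exponentials are renormalized to sum to 1, so some class always receives the largest share, digit or not. The logit values are made up for the demonstration.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: shift by the max before exponentiating."""
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

# Logits for a clear digit: one class dominates.
digit_logits = np.array([0.1, 9.0, 0.2, 0.1, 0.3])
# Logits for a non-digit image: nothing truly fits, but scores still differ.
junk_logits = np.array([1.2, 0.8, 2.1, 0.9, 1.0])

p_digit = softmax(digit_logits)
p_junk = softmax(junk_logits)

# Both distributions sum to 1, so even the junk image gets a "winning" class.
print(p_digit.max(), p_junk.max())
```

The junk image still receives a clear argmax, which is exactly why a separate rejection criterion is needed.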

We asked GQ about this. (Reply: the maximum raw output for a positive sample can exceed 6000, while for negative samples it stays below roughly 3000 for now; setting a threshold and discarding the small values works for the time being, but it is not a good long-term solution.)
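The thresholding idea from the reply can be sketched as follows. The cutoff of 4500 is a hypothetical midpoint between the reported ~6000 positive and ~3000 negative maxima, not a value from the post:

```python
import numpy as np

# Hypothetical cutoff between the reported ranges: positive samples peak
# above ~6000, negative samples stay below ~3000.
REJECT_THRESHOLD = 4500.0

def classify_or_reject(raw_scores, threshold=REJECT_THRESHOLD):
    """Return the predicted class index, or None when the strongest raw
    (pre-softmax) activation is too weak, i.e. the crop is probably not
    a digit and should be discarded rather than displayed."""
    raw_scores = np.asarray(raw_scores, dtype=float)
    if raw_scores.max() < threshold:
        return None  # reject: likely not a digit
    return int(raw_scores.argmax())
```

Note that the decision uses the raw activations, not the softmax output, since the softmax normalization destroys the magnitude information the threshold relies on.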

We trained three networks in total, each producing outputs in [0, 1]:

1. The LeNet from examples/mnist in the latest caffe-master. Its prototxt is as follows:

    name: "LeNet"
    input: "data"
    input_shape { dim: 1 dim: 1 dim: 28 dim: 28 }
    layer {
      name: "conv1" type: "Convolution" bottom: "data" top: "conv1"
      param { lr_mult: 1 } param { lr_mult: 2 }
      convolution_param {
        num_output: 20 kernel_size: 5 stride: 1
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer {
      name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1"
      pooling_param { pool: MAX kernel_size: 2 stride: 2 }
    }
    layer {
      name: "conv2" type: "Convolution" bottom: "pool1" top: "conv2"
      param { lr_mult: 1 } param { lr_mult: 2 }
      convolution_param {
        num_output: 50 kernel_size: 5 stride: 1
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer {
      name: "pool2" type: "Pooling" bottom: "conv2" top: "pool2"
      pooling_param { pool: MAX kernel_size: 2 stride: 2 }
    }
    layer {
      name: "ip1" type: "InnerProduct" bottom: "pool2" top: "ip1"
      param { lr_mult: 1 } param { lr_mult: 2 }
      inner_product_param {
        num_output: 500
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer { name: "relu1" type: "ReLU" bottom: "ip1" top: "ip1" }
    layer {
      name: "ip2" type: "InnerProduct" bottom: "ip1" top: "ip2"
      param { lr_mult: 1 } param { lr_mult: 2 }
      inner_product_param {
        num_output: 20
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer { name: "prob" type: "Softmax" bottom: "ip2" top: "prob" }

2. An MNIST prototxt found on the web. The input image size is 32 × 32:


    name: "LeNet"
    input: "data"
    input_dim: 1
    input_dim: 1
    input_dim: 32
    input_dim: 32
    layer {
      name: "conv1" type: "Convolution" bottom: "data" top: "conv1"
      param { lr_mult: 1 } param { lr_mult: 2 }
      convolution_param {
        num_output: 6 kernel_size: 5 stride: 1
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer {
      name: "pool1" type: "Pooling" bottom: "conv1" top: "pool1"
      pooling_param { pool: MAX kernel_size: 2 stride: 2 }
    }
    layer {
      name: "conv2" type: "Convolution" bottom: "pool1" top: "conv2"
      param { lr_mult: 1 } param { lr_mult: 2 }
      convolution_param {
        num_output: 16
        kernel_size: 5  # value garbled in the source; 5 matches standard LeNet-5
        stride: 1
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer {
      name: "ip1" type: "InnerProduct" bottom: "conv2" top: "ip1"
      param { lr_mult: 1 } param { lr_mult: 2 }
      inner_product_param {
        num_output: 120
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer { name: "relu1" type: "ReLU" bottom: "ip1" top: "ip1" }
    layer {
      name: "ip2" type: "InnerProduct" bottom: "ip1" top: "ip2"
      param { lr_mult: 1 } param { lr_mult: 2 }
      inner_product_param {
        num_output: 84
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer { name: "relu2" type: "ReLU" bottom: "ip2" top: "ip2" }
    layer {
      name: "ip3" type: "InnerProduct" bottom: "ip2" top: "ip3"
      param { lr_mult: 1 } param { lr_mult: 2 }
      inner_product_param {
        num_output: 20
        weight_filler { type: "xavier" } bias_filler { type: "constant" }
      }
    }
    layer { name: "prob" type: "Softmax" bottom: "ip3" top: "prob" }

3. LeNet2: the MNIST network with mean subtraction added.
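Mean subtraction itself is just an element-wise difference. A minimal sketch (in practice the mean image would come from Caffe's compute_image_mean tool or a training-set average):

```python
import numpy as np

def preprocess(img, mean_img):
    """Subtract the training-set mean image before feeding the network.
    img and mean_img must have the same shape, e.g. 32x32 grayscale."""
    return img.astype(np.float32) - mean_img.astype(np.float32)

img = np.full((32, 32), 128, dtype=np.uint8)
mean_img = np.full((32, 32), 100, dtype=np.uint8)
print(preprocess(img, mean_img).mean())  # each pixel becomes 28.0
```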

Issues to note:

A. When extracting features with MATLAB, the input dimensions in the prototxt must match the data:

input_dim: 1
input_dim: 1
input_dim: 32
input_dim: 32
B. When extracting features with MATLAB, remove all layers after the layer whose features you want. For example, to extract the features of the layer just before Softmax, remove the Softmax layer entirely.
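The layer-removal step in B can simply be done by hand in a text editor. Purely as an illustration, here is a rough Python sketch that drops a named layer and everything after it by brace counting (real tooling would use the protobuf text-format parser instead):

```python
def truncate_prototxt(text, drop_from_layer):
    """Split a deploy prototxt into top-level blocks by brace counting,
    then drop the block for `drop_from_layer` and everything after it."""
    blocks, buf, depth = [], [], 0
    for line in text.splitlines():
        buf.append(line)
        depth += line.count("{") - line.count("}")
        if depth == 0 and any(s.strip() for s in buf):
            blocks.append("\n".join(buf))
            buf = []
    kept = []
    for block in blocks:
        if f'name: "{drop_from_layer}"' in block:
            break  # this layer and everything after it are removed
        kept.append(block)
    return "\n".join(kept)
```

For instance, calling it with drop_from_layer="prob" on the deploy prototxt above keeps everything up to ip2 and removes the Softmax layer, matching the manual edit described in B.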


