Running the MNIST Example with Caffe on Windows


After installing Caffe, the natural first step is of course to run its own demos, which live under the examples folder.

The first one to try is MNIST handwritten digit recognition. The required code files are under examples/mnist/, but the image database is not included.

The MNIST dataset has 60,000 training samples and 10,000 test samples, all of them images of handwritten digits.

Caffe's data layers support two database formats: LMDB and LEVELDB.

LMDB files are larger than LEVELDB, but LMDB is faster and allows multiple training processes to read the same dataset at the same time.

By default, the example uses LMDB, but it can be switched to LEVELDB, as explained below.
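To confirm that a downloaded database is intact before training, it can be opened directly from Python. The following is a minimal sketch, assuming the lmdb Python package and pycaffe are installed and that you grabbed the LMDB version of the training set; a LEVELDB database would need a LevelDB binding such as plyvel instead, and the folder name below is just an example:

import lmdb
from caffe.proto import caffe_pb2

# Open the downloaded training database read-only and inspect it.
env = lmdb.open('examples/mnist/mnist_train_lmdb', readonly=True)
with env.begin() as txn:
    print('entries:', txn.stat()['entries'])        # expect 60000 for the training set
    key, value = next(txn.cursor().iternext())      # peek at the first record
    datum = caffe_pb2.Datum()
    datum.ParseFromString(value)
    print('image: %dx%dx%d, label: %d'
          % (datum.channels, datum.height, datum.width, datum.label))
env.close()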

It is best to search online for the MNIST dataset and download it (it is widely mirrored on network drives). Put the database folders into the \examples\mnist directory, preferably named as shown in the figure; otherwise the files may not be readable.

Earlier I had named the dataset folders with a hyphen instead of an underscore and got an error that the files could not be read, so pay attention to the folder names!

Under Windows it is best to choose the LEVELDB files; under Linux either format is fine. With ready-made LEVELDB files there is no need to run convert_imageset, which saves the steps of converting the images and computing the image mean.

Two. Training the MNIST model

The network definition used to train MNIST is lenet_train_test.prototxt:

name: "LeNet"
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include { phase: TRAIN }
  transform_param { scale: 0.00390625 }
  data_param {
    source: "examples/mnist/mnist_train_leveldb"
    batch_size: 64
    backend: LEVELDB
  }
}
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include { phase: TEST }
  transform_param { scale: 0.00390625 }
  data_param {
    source: "examples/mnist/mnist_test_leveldb"
    batch_size: 100
    backend: LEVELDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param { lr_mult: 1 }
  param { lr_mult: 2 }
  convolution_param {
    num_output: 20
    kernel_size: 5
    stride: 1
    weight_filler { type: "xavier" }
    bias_filler { type: "constant" }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param { lr_mult: 1 }
  param { lr_mult: 2 }
  convolution_param {
    num_output: 50
    kernel_size: 5
    stride: 1
    weight_filler { type: "xavier" }
    bias_filler { type: "constant" }
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool2"
  top: "ip1"
  param { lr_mult: 1 }
  param { lr_mult: 2 }
  inner_product_param {
    num_output: 500
    weight_filler { type: "xavier" }
    bias_filler { type: "constant" }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "ip1"
  top: "ip1"
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param { lr_mult: 1 }
  param { lr_mult: 2 }
  inner_product_param {
    num_output: 10
    weight_filler { type: "xavier" }
    bias_filler { type: "constant" }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  include { phase: TEST }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}

Generally you only need to modify the "source" paths of the two data layers. In the listing above I have already changed them to the paths of my MNIST training-set and test-set folders. Also note "backend: LEVELDB": the default backend is LMDB, so it has to be changed as well!
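As a quick sanity check that the edited definition parses and that the data layer can actually open the database folders, the network can be loaded from Python. A minimal sketch, assuming pycaffe is importable and the script is run from the Caffe root so the relative paths resolve:

import caffe

caffe.set_mode_cpu()
# Loading in TEST phase builds the network and opens the test database.
net = caffe.Net('examples/mnist/lenet_train_test.prototxt', caffe.TEST)
for name, blob in net.blobs.items():
    print(name, blob.data.shape)   # e.g. data: (100, 1, 28, 28), ip2: (100, 10)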

After modifying the network model lenet_train_test.prototxt, modify lenet_solver.prototxt.

This file mainly sets the learning parameters and the training strategy:

  

# The train/test net protocol buffer definition
net: "examples/mnist/lenet_train_test.prototxt"
# test_iter specifies how many forward passes the test should carry out.
# In the case of MNIST, we have test batch size 100 and 100 test iterations,
# covering the full 10,000 testing images.
test_iter: 100
# Carry out testing every 500 training iterations.
test_interval: 500
# The base learning rate, momentum and the weight decay of the network.
base_lr: 0.01
momentum: 0.9
weight_decay: 0.0005
# The learning rate policy
lr_policy: "inv"
gamma: 0.0001
power: 0.75
# Display every 100 iterations
display: 100
# The maximum number of iterations
max_iter: 10000
# Snapshot intermediate results
snapshot: 5000
snapshot_prefix: "examples/mnist/lenet"
# Solver mode: CPU or GPU
solver_mode: CPU

The lines starting with # are comments and are largely self-explanatory; only a few points need attention:

The net entry on the second line: change this path to your own network model, xx_train_test.prototxt. Parameters such as base_lr and lr_policy are best left unchanged; max_iter (the maximum number of iterations) can be made a little smaller, and the display interval can be changed freely. As for the last line, I only have CPU mode, so it is set to CPU; if GPU acceleration is available, set it to GPU.
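Once the solver file is set up, training can also be launched from Python instead of caffe.exe, which is handy for experimenting with the solver. A minimal sketch, assuming pycaffe is importable and the script is run from the Caffe root:

import caffe

caffe.set_mode_cpu()   # matches solver_mode: CPU; use caffe.set_mode_gpu() for a GPU build
solver = caffe.SGDSolver('examples/mnist/lenet_solver.prototxt')
solver.solve()         # runs until max_iter (10000) and writes the snapshot files

solver.step(100) can be used instead of solve() to advance a fixed number of iterations at a time.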

That completes the basic setup. Next, write the command that actually runs training:

I chose to write a batch (.bat) file; alternatively, the command can be typed directly in a CMD window.

Create a new mnist_train.bat with the following content:

cd ../../
"Build/x64/Debug/caffe.exe" train --solver=examples/mnist/lenet_solver.prototxt

Adjust the path on the second line to your own setup. On Windows the executable lives under the Build/x64 directory; some blog posts point to a /bin/ directory, but that is for Linux and does not apply to the Windows build. Also use forward slashes "/"; backslashes "\" are not recognized here (Python code often uses the latter, so change them when copying paths).

My environment only has a Debug directory; if you have a Release directory, use that instead.

Once the .bat runs successfully, training starts. The end of the training output looks like this:

The last few lines show that the accuracy reaches about 99%, which is quite accurate!

Note: the .caffemodel snapshot files are generated inside the Caffe folder (at the path given by snapshot_prefix in the solver).

Next, use this .caffemodel file to run the test.

Three. Testing the data

Since the test dataset was also downloaded directly as LEVELDB files, this saves a lot of steps.

Simply create a new mnist_test.bat, similar to the training one, to test the model:

cd ../../
"Build/x64/Debug/caffe.exe" test --model=examples/mnist/lenet_train_test.prototxt --weights=examples/mnist/lenet_iter_10000.caffemodel

As with mnist_train.bat, adjust the file paths: test indicates test mode, --model points to your own network model file, and --weights supplies the .caffemodel file to be tested.

After running mnist_test.bat, the successful output looks like this:

  

The last line still shows about 98% accuracy, which is very good and shows that the trained model holds up.
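The same number can be reproduced from Python, which also shows how the weights file is consumed. A minimal sketch, assuming pycaffe is importable, the script runs from the Caffe root, and the snapshot is named as the solver settings above imply (snapshot_prefix "examples/mnist/lenet" with max_iter 10000 gives lenet_iter_10000.caffemodel):

import caffe

caffe.set_mode_cpu()
net = caffe.Net('examples/mnist/lenet_train_test.prototxt',     # network definition
                'examples/mnist/lenet_iter_10000.caffemodel',   # trained weights
                caffe.TEST)

n_batches = 100                                  # test_iter in the solver file
acc = 0.0
for _ in range(n_batches):
    acc += float(net.forward()['accuracy'])      # each forward() reads the next test batch
print('accuracy: %.4f' % (acc / n_batches))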

  

Summary: I actually ran into quite a few scattered problems along the way, most of which could be solved with a web search. The main things to remember are to adjust the file paths to your own directories, to use LEVELDB data files on Windows, and to modify the .prototxt files. Then wait for the model to run through; seeing the high accuracy at the end is very satisfying.

Four. Using the model

The model is trained and the data has been tested, but how do we actually use the model to tell which digit a given picture shows?

For that you need to build classification.exe and then run a corresponding .bat command to predict the classification result for a picture.

For classifying MNIST digits this way, see http://www.cnblogs.com/yixuan-xu/p/5862657.html
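As an alternative to building classification.exe, a single image can also be classified through pycaffe. The sketch below is only an illustration: it assumes pycaffe, the deploy-style definition examples/mnist/lenet.prototxt that ships with Caffe, the lenet_iter_10000.caffemodel snapshot from above, and a hypothetical 28x28 white-on-black digit saved as digit.png.

import caffe

caffe.set_mode_cpu()
net = caffe.Net('examples/mnist/lenet.prototxt',                # deploy definition (no data/loss layers)
                'examples/mnist/lenet_iter_10000.caffemodel',
                caffe.TEST)

img = caffe.io.load_image('digit.png', color=False)            # HxWx1, float values in [0, 1]
img = caffe.io.resize_image(img, (28, 28))
net.blobs['data'].reshape(1, 1, 28, 28)                        # one grayscale 28x28 image
net.blobs['data'].data[0, 0, ...] = img[:, :, 0]               # roughly the 0.00390625 scaling used in training
prob = net.forward()['prob'][0]                                # softmax output of the final "prob" layer
print('predicted digit:', prob.argmax())

Keep in mind that MNIST digits are white strokes on a black background, so an ordinary scanned digit may need to be inverted before it is fed in.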

I also found that OpenCV can load Caffe framework models; I plan to write a separate blog post to try that out in practice:

http://docs.opencv.org/3.1.0/d5/de7/tutorial_dnn_googlenet.html
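For reference, a minimal sketch of the OpenCV route, assuming a build recent enough to expose cv2.dnn.readNetFromCaffe (3.3 or later; the linked 3.1.0 tutorial uses an older contrib API), plus the same deploy prototxt, snapshot, and hypothetical digit.png as above:

import cv2

net = cv2.dnn.readNetFromCaffe('examples/mnist/lenet.prototxt',
                               'examples/mnist/lenet_iter_10000.caffemodel')
img = cv2.imread('digit.png', cv2.IMREAD_GRAYSCALE)
blob = cv2.dnn.blobFromImage(img, scalefactor=1 / 256.0, size=(28, 28))   # same 0.00390625 scaling as training
net.setInput(blob)
prob = net.forward()
print('predicted digit:', int(prob.argmax()))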
