Caffe: Train the MNIST network with your own data


Software used to draw white digits on a black background: KolourPaint.

Assume that all images of the digit "1" are placed in a folder named 1 (and likewise for 0-9). For each digit, collect the image file names, append the label to each name by hand, and then merge the lists into train.txt.

1. Get the names of all the images in the folder:

find ./1 -name '*.png' > 1.txt

At this point the image names in 1.txt still include the leading path, which needs to be removed:

sudo sed -i "s/.\/1\///g" 1.txt    # (\ is the escape character, hence the double quotes here instead of single quotes)

2. Append the label after each name in 1.txt.

1.txt then looks like:

1101.png 1

1102.png 1

... and so on. Do the same for the other digit folders; one way to script this for all ten digits is sketched below.
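A minimal sketch of my own (not from the original post) that appends the labels and merges everything; it assumes the per-digit lists 0.txt ... 9.txt were generated with the find/sed commands above:

for i in 0 1 2 3 4 5 6 7 8 9; do
    sed -i "s/$/ $i/" $i.txt      # append " <label>" to the end of every line
done
cat [0-9].txt > train.txt         # merge the labeled lists into train.txt

The test.txt list for the test image set is built the same way from the test images.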

3. Convert the image data to LMDB format

Under caffe/examples, create a folder to hold the training files: sd_mnist

3.1 In sd_mnist, create sd_create_lmdb.sh to convert the images:

sudo vim sd_create_lmdb.sh, with the following content:

#!/usr/bin/env sh
# Create the imagenet lmdb inputs
# N.B. set the path to the imagenet train + val data dirs


EXAMPLE=examples/sd_mnist    # (Note: the directory you created under examples)
DATA=data/sd_mnist           # (Note: a new directory under data containing the two image sets (train and test) and the two txt files above)
TOOLS=build/tools


TRAIN_DATA_ROOT=data/sd_mnist/train/    # (Note: path to the training image set)
VAL_DATA_ROOT=data/sd_mnist/test/       # (Note: path to the test image set)


# Set RESIZE=true to resize the images to 28x28. Leave as false if the images
# have already been resized using another tool.
RESIZE=true
if $RESIZE; then
  RESIZE_HEIGHT=28
  RESIZE_WIDTH=28
else
  RESIZE_HEIGHT=0
  RESIZE_WIDTH=0
fi


if [ ! -d "$TRAIN_DATA_ROOT" ]; then
  echo "Error: TRAIN_DATA_ROOT is not a path to a directory: $TRAIN_DATA_ROOT"
  echo "Set the TRAIN_DATA_ROOT variable in sd_create_lmdb.sh to the path" \
       "where the training data is stored."
  exit 1
fi


if [ ! -d "$VAL_DATA_ROOT" ]; then
  echo "Error: VAL_DATA_ROOT is not a path to a directory: $VAL_DATA_ROOT"
  echo "Set the VAL_DATA_ROOT variable in sd_create_lmdb.sh to the path" \
       "where the validation data is stored."
  exit 1
fi


echo "Creating train Lmdb ..."


GLOG_logtostderr=1 $TOOLS/convert_imageset \
    --resize_height=$RESIZE_HEIGHT \
    --resize_width=$RESIZE_WIDTH \
    --shuffle \
    $TRAIN_DATA_ROOT \
    $DATA/train.txt \
    $EXAMPLE/mnist_train_lmdb      # (Note: mind the path to train.txt above)


echo "Creating Test Lmdb ..."


GLOG_logtostderr=1 $TOOLS/convert_imageset \
    --resize_height=$RESIZE_HEIGHT \
    --resize_width=$RESIZE_WIDTH \
    --shuffle \
    $VAL_DATA_ROOT \
    $DATA/test.txt \
    $EXAMPLE/mnist_test_lmdb       # (Note: mind the path to test.txt above)


echo "Done."

-----------------------------------------------------------------------

3.2 Run: sh examples/sd_mnist/sd_create_lmdb.sh (from the caffe root directory)

If it succeeds, the terminal output shows that the generated databases have a non-zero size (not 0 KB), and two directories appear under examples/sd_mnist: mnist_train_lmdb and mnist_test_lmdb, each containing data.mdb and lock.mdb.
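As a quick sanity check (my own addition, assuming the directory layout above), you can list the generated databases and confirm that data.mdb is not empty:

ls -lh examples/sd_mnist/mnist_train_lmdb examples/sd_mnist/mnist_test_lmdb    # data.mdb should be well above 0 KB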

4. Training on our dataset: the following files are copied from caffe/examples/mnist to caffe/examples/sd_mnist and modified. Only the path information changes; the network itself stays the same.

4.1 The first .sh file is train_lenet.sh:

#!/usr/bin/env sh
set -e

./build/tools/caffe train --solver=examples/sd_mnist/lenet_solver.prototxt $@
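The trailing $@ simply forwards any extra command-line arguments to the caffe binary. For example (my own usage note, with a hypothetical snapshot file name), training could later be resumed with: sh examples/sd_mnist/train_lenet.sh --snapshot=examples/sd_mnist/lenet_iter_5000.solverstate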

4.2 Copy the lenet_solver.prototxt file and modify it:

# The train/test net protocol buffer definition
net: "examples/sd_mnist/lenet_train_test.prototxt"
# test_iter specifies how many forward passes the test should carry out.
# In the case of MNIST, we have test batch size 100 and 100 test iterations,
# covering the full 10,000 testing images.
test_iter: 100
# Carry out testing every 500 training iterations.
test_interval: 500
# The base learning rate, momentum and the weight decay of the network.
base_lr: 0.01
momentum: 0.9
weight_decay: 0.0005
# The learning rate policy
lr_policy: "inv"
gamma: 0.0001
power: 0.75
# Display every 100 iterations
display: 100
# The maximum number of iterations
max_iter: 10000
# Snapshot intermediate results
snapshot: 5000
snapshot_prefix: "examples/sd_mnist/lenet"
# Solver mode: CPU or GPU
solver_mode: CPU
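A quick check of the test settings (my own reasoning, not from the original post): test_iter multiplied by the test batch_size determines how many images each test pass covers. With test_iter: 100 here and batch_size: 100 in lenet_train_test.prototxt below, every evaluation looks at 100 × 100 = 10,000 images; if your own test set is smaller, reduce test_iter (or the test batch_size) so the product matches its size.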

4.3 Copy lenet_train_test.prototxt from the mnist folder to the current folder and modify the paths:

Name: "LeNet"
Layer {
Name: "Mnist"
Type: "Data"
Top: "Data"
Top: "Label"
Include {
Phase:train
}
Transform_param {
scale:0.00390625
}
Data_param {
Source: "Examples/sd_mnist/mnist_train_lmdb"
Batch_size:64
Backend:lmdb
}
}
Layer {
Name: "Mnist"
Type: "Data"
Top: "Data"
Top: "Label"
Include {
Phase:test
}
Transform_param {
scale:0.00390625
}
Data_param {
Source: "Examples/sd_mnist/mnist_test_lmdb"
batch_size:100
Backend:lmdb
}
}
Layer {
Name: "Conv1"
Type: "Convolution"
Bottom: "Data"
Top: "Conv1"
param {
Lr_mult:1
}
param {
Lr_mult:2
}
Convolution_param {
Num_output:20
Kernel_size:5
Stride:1
Weight_filler {
Type: "Xavier"
}
Bias_filler {
Type: "Constant"
}
}
}
Layer {
Name: "Pool1"
Type: "Pooling"
Bottom: "Conv1"
Top: "Pool1"
Pooling_param {
Pool:max
Kernel_size:2
Stride:2
}
}
Layer {
Name: "Conv2"
Type: "Convolution"
Bottom: "Pool1"
Top: "Conv2"
param {
Lr_mult:1
}
param {
Lr_mult:2
}
Convolution_param {
Num_output:50
Kernel_size:5
Stride:1
Weight_filler {
Type: "Xavier"
}
Bias_filler {
Type: "Constant"
}
}
}
Layer {
Name: "Pool2"
Type: "Pooling"
Bottom: "Conv2"
Top: "Pool2"
Pooling_param {
Pool:max
Kernel_size:2
Stride:2
}
}
Layer {
Name: "Ip1"
Type: "Innerproduct"
Bottom: "Pool2"
Top: "Ip1"
param {
Lr_mult:1
}
param {
Lr_mult:2
}
Inner_product_param {
num_output:500
Weight_filler {
Type: "Xavier"
}
Bias_filler {
Type: "Constant"
}
}
}
Layer {
Name: "RELU1"
Type: "ReLU"
Bottom: "Ip1"
Top: "Ip1"
}
Layer {
Name: "IP2"
Type: "Innerproduct"
Bottom: "Ip1"
Top: "IP2"
param {
Lr_mult:1
}
param {
Lr_mult:2
}
Inner_product_param {
Num_output:10
Weight_filler {
Type: "Xavier"
}
Bias_filler {
Type: "Constant"
}
}
}
Layer {
Name: "Accuracy"
Type: "Accuracy"
Bottom: "IP2"
Bottom: "Label"
Top: "Accuracy"
Include {
Phase:test
}
}
Layer {
Name: "Loss"
Type: "Softmaxwithloss"
Bottom: "IP2"
Bottom: "Label"
Top: "Loss"
}

4.4 Copy lenet.prototxt from the mnist folder to the current folder without modification (it is the deploy definition of the same network and is not used during training itself).

4.5 Run: sh examples/sd_mnist/train_lenet.sh

If it runs without errors and prints accuracy and loss values, training has succeeded.
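With the solver settings above (snapshot: 5000, max_iter: 10000, snapshot_prefix: "examples/sd_mnist/lenet"), Caffe should also leave snapshot files such as lenet_iter_5000.caffemodel/.solverstate and lenet_iter_10000.caffemodel/.solverstate under examples/sd_mnist; the .caffemodel files are the trained weights that can be used for classification afterwards.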

Reference: http://blog.csdn.net/xiaoxiao_huitailang/article/details/51361036
