Caffe: how to write image data into LMDB format

Source: Internet
Author: User
Tags: shuffle

I recently got Caffe set up and ran the MNIST example and one other, but at the time it was not clear how to import my own images. After two days of digging I finally understood the whole process, and it is actually quite simple. Here I use the MNIST database for the experiment.

First we need two txt files: one is train.txt, the other is test.txt.

The contents are as follows:



The first field on each line is the name of the image in the image directory; the second number is its label.
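For example, with the naming scheme produced by the MATLAB script below, the first few lines of the test listing could look like this (the labels shown here are illustrative, separated from the filename by a tab):

```text
testimage/test_00001.jpg	7
testimage/test_00002.jpg	2
testimage/test_00003.jpg	1
```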

Here is the MATLAB code that generates the test images and the listing file:

clc
clear
load('mnist_uint8.mat')
num = size(test_x, 1);
fid = fopen('test_minst.txt', 'wt');
savepath = 'testimage/';
% train_y = interge(train_y);
for i = 1:num
    % Recover the 28x28 image and the digit label from the one-hot row.
    image = reshape(test_x(i,:), [28 28]);
    label = find(test_y(i,:) ~= 0) - 1;
    % Zero-pad the index so every filename has five digits.
    if i < 10
        imagename = strcat('test_0000', num2str(i));
    end
    if i < 100 && i > 9
        imagename = strcat('test_000', num2str(i));
    end
    if i < 1000 && i > 99
        imagename = strcat('test_00', num2str(i));
    end
    if i < 10000 && i > 999
        imagename = strcat('test_0', num2str(i));
    end
    if i > 9999
        imagename = strcat('test_', num2str(i));
    end
    imagename = strcat(imagename, '.jpg');
    imagepath = strcat(savepath, imagename);
    fprintf(fid, '%s\t', imagepath);
    fprintf(fid, '%s\n', num2str(label));
    imwrite(image, imagepath, 'jpg');
end
fclose(fid);
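The chain of if-branches above only zero-pads the image index to five digits. As a sanity check, the same naming scheme in Python (a sketch, not part of the original tutorial) is a single format expression:

```python
def image_name(i: int, prefix: str = "test_") -> str:
    """Zero-pad the 1-based index to five digits, matching the MATLAB branches."""
    return f"{prefix}{i:05d}.jpg"

print(image_name(7))      # test_00007.jpg
print(image_name(10000))  # test_10000.jpg
```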

After running this (and the analogous run for the training set, using train_x and train_y), we get two image folders and two txt files.
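If you prefer to skip MATLAB, a listing file can also be generated from an existing image folder with a few lines of stdlib-only Python. This is a sketch: the directory layout matches the one above, but the label function is a placeholder you would replace with the real MNIST labels.

```python
import os

def write_listing(image_dir: str, listing_path: str, label_fn) -> int:
    """Write 'relative/path<TAB>label' lines for every .jpg in image_dir.

    label_fn maps a filename to its integer label; here it is a
    placeholder -- in this tutorial the label comes from the MNIST
    annotations, not from the filename.
    """
    folder = os.path.basename(os.path.normpath(image_dir))
    names = sorted(n for n in os.listdir(image_dir) if n.endswith(".jpg"))
    with open(listing_path, "wt") as fid:
        for name in names:
            fid.write(f"{folder}/{name}\t{label_fn(name)}\n")
    return len(names)
```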

Next, create a new directory under caffe/examples; name it whatever you like, but I called mine newmnist. I also put the test image set, the training image set, and the two txt files into a subfolder of the caffe data directory (a step I forgot to mention earlier).

Then create a new .sh script in the examples/newmnist folder. Its goal is to convert the image sets into LMDB format.

The script is as follows:

#!/usr/bin/env sh
# Create the mnist lmdb inputs
# N.B. set the paths to the train + val data dirs

EXAMPLE=examples/newmnist
DATA=data/testmnist
TOOLS=build/tools

TRAIN_DATA_ROOT=data/testmnist/image/
VAL_DATA_ROOT=data/testmnist/testimage/

# Set RESIZE=true to resize the images to 28x28. Leave as false if the images
# have already been resized using another tool.
RESIZE=true
if $RESIZE; then
  RESIZE_HEIGHT=28
  RESIZE_WIDTH=28
else
  RESIZE_HEIGHT=0
  RESIZE_WIDTH=0
fi

if [ ! -d "$TRAIN_DATA_ROOT" ]; then
  echo "Error: TRAIN_DATA_ROOT is not a path to a directory: $TRAIN_DATA_ROOT"
  echo "Set the TRAIN_DATA_ROOT variable in this script to the path" \
       "where the training data is stored."
  exit 1
fi

if [ ! -d "$VAL_DATA_ROOT" ]; then
  echo "Error: VAL_DATA_ROOT is not a path to a directory: $VAL_DATA_ROOT"
  echo "Set the VAL_DATA_ROOT variable in this script to the path" \
       "where the validation data is stored."
  exit 1
fi

echo "Creating train lmdb..."

GLOG_logtostderr=1 $TOOLS/convert_imageset \
    --resize_height=$RESIZE_HEIGHT \
    --resize_width=$RESIZE_WIDTH \
    --shuffle \
    $TRAIN_DATA_ROOT \
    $DATA/train_minst.txt \
    $EXAMPLE/mnist_train_lmdb

echo "Creating test lmdb..."

GLOG_logtostderr=1 $TOOLS/convert_imageset \
    --resize_height=$RESIZE_HEIGHT \
    --resize_width=$RESIZE_WIDTH \
    --shuffle \
    $VAL_DATA_ROOT \
    $DATA/test_minst.txt \
    $EXAMPLE/mnist_test_lmdb

echo "Done."




The variables at the top of the script are the parts you need to change:

EXAMPLE is your new directory under examples.

DATA is your new folder under data holding the two image sets (training and test) and the two txt files mentioned above.

TRAIN_DATA_ROOT is the path to the training image set.

VAL_DATA_ROOT is the path to the test image set.

Run: sh examples/newmnist/creatmnistlmdb.sh

You can then find the two folders mnist_test_lmdb and mnist_train_lmdb under examples/newmnist/.



To verify that we obtained the LMDB files correctly, we train on the MNIST data set.

The first file is train_lenet.sh:

#!/usr/bin/env sh

./build/tools/caffe train --solver=examples/newmnist/lenet_solver.prototxt

The --solver path is the part modified to point at our directory.

Next, create a lenet_solver.prototxt file:

# The train/test net protocol buffer definition
net: "examples/newmnist/lenet_train_test.prototxt"
# test_iter specifies how many forward passes the test should carry out.
# In the case of MNIST, we have a test batch size of 100 and 100 test
# iterations, covering the full 10,000 testing images.
test_iter: 100
# Carry out testing every 500 training iterations.
test_interval: 500
# The base learning rate, momentum and the weight decay of the network.
base_lr: 0.01
momentum: 0.9
weight_decay: 0.0005
# The learning rate policy
lr_policy: "inv"
gamma: 0.0001
power: 0.75
# Display every 100 iterations
display: 100
# The maximum number of iterations
max_iter: 10000
# Snapshot intermediate results
snapshot: 500
snapshot_prefix: "examples/newmnist/lenet"
# Solver mode: CPU or GPU
solver_mode: GPU
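The value of test_iter is not arbitrary: each test pass scores one TEST-phase batch, so test_iter multiplied by the TEST batch_size (100, set in lenet_train_test.prototxt) must cover the 10,000 MNIST test images. A one-line check:

```python
test_iter = 100        # from lenet_solver.prototxt
test_batch_size = 100  # from the TEST-phase Data layer in lenet_train_test.prototxt

# Each test iteration consumes one batch, so the product is the number of
# images scored per test phase.
assert test_iter * test_batch_size == 10000  # the full MNIST test set
print(test_iter * test_batch_size)  # 10000
```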


The net and snapshot_prefix paths are the parts that need to be modified.

Next, copy lenet_train_test.prototxt from the mnist folder to the current folder and modify its data source paths:

name: "LeNet"
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/newmnist/mnist_train_lmdb"
    batch_size: 64
    backend: LMDB
  }
}
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TEST
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/newmnist/mnist_test_lmdb"
    batch_size: 100
    backend: LMDB
  }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 20
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "conv2"
  type: "Convolution"
  bottom: "pool1"
  top: "conv2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  convolution_param {
    num_output: 50
    kernel_size: 5
    stride: 1
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "pool2"
  type: "Pooling"
  bottom: "conv2"
  top: "pool2"
  pooling_param {
    pool: MAX
    kernel_size: 2
    stride: 2
  }
}
layer {
  name: "ip1"
  type: "InnerProduct"
  bottom: "pool2"
  top: "ip1"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 500
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "ip1"
  top: "ip1"
}
layer {
  name: "ip2"
  type: "InnerProduct"
  bottom: "ip1"
  top: "ip2"
  param {
    lr_mult: 1
  }
  param {
    lr_mult: 2
  }
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "constant"
    }
  }
}
layer {
  name: "accuracy"
  type: "Accuracy"
  bottom: "ip2"
  bottom: "label"
  top: "accuracy"
  include {
    phase: TEST
  }
}
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
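One detail worth noting in the file above: the scale of 0.00390625 in transform_param is exactly 1/256, which maps the 8-bit pixel values into the [0, 1) range:

```python
# 0.00390625 is 1/256: dividing 8-bit pixels (0..255) by 256 keeps them in [0, 1).
scale = 0.00390625
assert scale == 1 / 256
assert 255 * scale < 1.0
print(255 * scale)  # 0.99609375
```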

The data_param source paths are the parts that need to be modified.

Finally, run: sh examples/newmnist/train_lenet.sh


