Data enters a Caffe network through the data layer, which sits at the bottom of the network. Data can be read from an efficient database (such as LevelDB or LMDB), directly from memory, or, when read/write efficiency is less critical, from HDF5 files or ordinary image files on disk.
The data comes from a database:
Layer Type: Data
Parameters that must be set:
source: the directory that contains the database, e.g. examples/mnist/mnist_train_lmdb
batch_size: the number of samples processed at a time, e.g. 64
Optional parameters:
rand_skip: skip this many inputs at the start; this is often useful for asynchronous SGD
backend: either LEVELDB or LMDB; the default is LEVELDB
Example:
layer {
  name: "mnist"
  type: "Data"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    scale: 0.00390625
  }
  data_param {
    source: "examples/mnist/mnist_train_lmdb"
    batch_size: 64
    backend: LMDB
  }
}
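The transform_param scale above multiplies every input value: 0.00390625 is 1/256, so raw 8-bit pixel values in [0, 255] are mapped into [0, 1). A minimal sketch of that arithmetic (the helper name is illustrative, not part of the Caffe API):

```python
# Sketch of Caffe's "scale" preprocessing step; scale_pixels is an
# illustrative helper, not a Caffe function.
SCALE = 0.00390625  # 1/256, as in the data_param example above

def scale_pixels(pixels, scale=SCALE):
    """Multiply each raw pixel value by the transform scale."""
    return [p * scale for p in pixels]

print(scale_pixels([0, 128, 255]))  # -> [0.0, 0.5, 0.99609375]
```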
The data comes from memory:
Layer Type: MemoryData
Parameters that must be set:
batch_size: the number of samples processed at a time, e.g. 2
channels: number of channels
height: image height
width: image width
Example:
layer {
  name: "memory_data"
  type: "MemoryData"
  top: "data"
  top: "label"
  memory_data_param {
    batch_size: 2
    height: 100
    width: 100
    channels: 1
  }
  transform_param {
    scale: 0.0078125
    mean_file: "mean.proto"
    mirror: false
  }
}
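A MemoryData layer with the parameters above expects its input as one contiguous blob laid out in N×C×H×W (batch, channel, height, width) order. A minimal sketch of that index arithmetic, assuming the shape from the example (the helper name is illustrative, not a Caffe API):

```python
# Sketch of NCHW blob indexing; nchw_offset is an illustrative helper,
# not part of the Caffe API.
def nchw_offset(n, c, h, w, channels, height, width):
    """Offset of element (n, c, h, w) in a flat NCHW buffer."""
    return ((n * channels + c) * height + h) * width + w

# With batch_size=2, channels=1, height=100, width=100 as above, one
# sample occupies 1 * 100 * 100 = 10000 elements, so the second sample
# starts at offset 10000.
print(nchw_offset(1, 0, 0, 0, 1, 100, 100))  # -> 10000
```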
The data comes from HDF5:
Layer Type: HDF5Data
Parameters that must be set:
source: the name of a text file that lists the HDF5 files to read
batch_size: the number of samples processed at a time
Example:
layer {
  name: "data"
  type: "HDF5Data"
  top: "data"
  top: "label"
  hdf5_data_param {
    source: "examples/hdf5_classification/data/train.txt"
    batch_size: 10
  }
}
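The source for HDF5Data is a plain text file listing one HDF5 file path per line; each listed file must contain datasets named after the layer's top blobs (here "data" and "label"). A minimal sketch of writing such a list file, with illustrative file names:

```python
# Sketch: build the HDF5Data "source" list file.  The .h5 file names are
# illustrative; Caffe reads each listed file in turn.
import os
import tempfile

h5_files = ["data/train_part1.h5", "data/train_part2.h5"]
list_path = os.path.join(tempfile.mkdtemp(), "train.txt")
with open(list_path, "w") as f:
    f.write("\n".join(h5_files) + "\n")

with open(list_path) as f:
    listed = f.read().splitlines()
print(listed)  # -> ['data/train_part1.h5', 'data/train_part2.h5']
```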
The data comes from image files:
Layer Type: ImageData
Parameters that must be set:
source: the name of a text file in which each line gives an image file name and its label
batch_size: the number of images processed at a time
Optional Parameters:
rand_skip: skip this many inputs at the start; this is often useful for asynchronous SGD
shuffle: randomly shuffle the order; the default is false
new_height, new_width: if set, images are resized to this size
Example:
layer {
  name: "data"
  type: "ImageData"
  top: "data"
  top: "label"
  transform_param {
    mirror: false
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
  image_data_param {
    source: "examples/_temp/file_list.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
}
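The ImageData source file holds one "path label" pair per line, separated by whitespace. A minimal sketch of producing and parsing such a file (image names and labels are illustrative):

```python
# Sketch of the ImageData "source" list format: one "path label" per line.
# The image paths and labels below are illustrative.
import os
import tempfile

samples = [("images/cat.jpg", 0), ("images/dog.jpg", 1)]
list_path = os.path.join(tempfile.mkdtemp(), "file_list.txt")
with open(list_path, "w") as f:
    for path, label in samples:
        f.write("%s %d\n" % (path, label))

# Parse the file back the way a reader would.
with open(list_path) as f:
    parsed = [(p, int(l)) for p, l in (line.split() for line in f)]
print(parsed)  # -> [('images/cat.jpg', 0), ('images/dog.jpg', 1)]
```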
The data comes from windows (image regions, as used for detection):
Layer Type: WindowData
Parameters that must be set:
source: the name of a text file describing the windows
batch_size: the number of windows processed at a time
Example:
layer {
  name: "data"
  type: "WindowData"
  top: "data"
  top: "label"
  include {
    phase: TRAIN
  }
  transform_param {
    mirror: true
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
  window_data_param {
    source: "examples/finetune_pascal_detection/window_file_2007_trainval.txt"
    batch_size: 128
    fg_threshold: 0.5
    bg_threshold: 0.5
    fg_fraction: 0.25
    context_pad: 16
    crop_mode: "warp"
  }
}
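WindowData samples windows by their overlap (intersection-over-union) with ground-truth boxes: windows at or above fg_threshold count as foreground, those below bg_threshold as background, and fg_fraction controls the share of foreground windows per batch. A minimal sketch of that overlap test, assuming (x1, y1, x2, y2) boxes (the helper names are illustrative, not Caffe APIs):

```python
# Sketch of IoU-based window labeling as used by WindowData-style sampling.
# iou and label_window are illustrative helpers, not Caffe functions.
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    iw, ih = max(0, ix2 - ix1), max(0, iy2 - iy1)
    inter = iw * ih
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter)

def label_window(window, gt_box, fg_threshold=0.5):
    """Classify a window by its overlap with a ground-truth box."""
    return "fg" if iou(window, gt_box) >= fg_threshold else "bg"

gt = (0, 0, 10, 10)
print(label_window((0, 0, 10, 10), gt))   # -> fg  (IoU = 1.0)
print(label_window((5, 0, 15, 10), gt))   # -> bg  (IoU = 1/3)
```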
The data comes from dummy data: DummyData is mainly used for development and debugging.
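For completeness, a DummyData layer can be written in the same style as the examples above; the shape and filler values here are illustrative, not from the original text:

```
layer {
  name: "data"
  type: "DummyData"
  top: "data"
  dummy_data_param {
    shape { dim: 10 dim: 1 dim: 28 dim: 28 }
    data_filler { type: "constant" value: 0 }
  }
}
```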