The Road of Computer Vision with Caffe, Part 4: Training and Prediction Example on the VOC2007 Dataset

1. Preparatory work

1). Download the pre-trained model

http://cs.unc.edu/~wliu/projects/ParseNet/VGG_ILSVRC_16_layers_fc_reduced.caffemodel

Put this file in the /home/software/caffe/models/VGGNet/ directory; it is best to keep a backup copy as well.
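For example (a minimal sketch, assuming the directory layout used throughout this article and that wget is available):

cd /home/software/caffe/models/VGGNet
wget http://cs.unc.edu/~wliu/projects/ParseNet/VGG_ILSVRC_16_layers_fc_reduced.caffemodel
# keep a backup copy of the pre-trained weights
cp VGG_ILSVRC_16_layers_fc_reduced.caffemodel VGG_ILSVRC_16_layers_fc_reduced.caffemodel.bak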

2). Download the VOC2007 data set

wget http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar

wget http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar

wget http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar

We mainly use VOCtrainval_06-Nov-2007.tar and VOCtest_06-Nov-2007.tar; both extract into a VOCdevkit directory.

3). Merge the trainval and test parts of the VOC2007 dataset

Extract VOCtrainval_06-Nov-2007.tar and VOCtest_06-Nov-2007.tar and merge the two resulting VOCdevkit trees directly. After merging, the Annotations, ImageSets, JPEGImages, SegmentationClass, and SegmentationObject folders each contain the trainval, train, val, and test files.
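A minimal sketch of the extract-and-merge step (it assumes the tar files were downloaded into /home/working; both VOC2007 archives unpack into the same VOCdevkit/VOC2007 tree, so extracting them in the same place merges them automatically):

cd /home/working
tar -xvf VOCtrainval_06-Nov-2007.tar   # creates VOCdevkit/VOC2007 with the trainval data
tar -xvf VOCtest_06-Nov-2007.tar       # unpacks into the same VOCdevkit/VOC2007, adding the test data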

4). Folder placement: put VOCdevkit in the /home/working/ directory:
/home/working/VOCdevkit

In the /home/software/caffe/data directory, make a backup copy of the original VOC0712 folder, which contains create_data.sh, create_list.sh, labelmap_voc.prototxt, and the generated test.txt, test_name_size.txt, and trainval.txt:
/home/software/caffe/data/VOC0712
/home/software/caffe/data/VOC0712_bak

In the /home/software/caffe/examples directory, make a backup copy of the original VOC0712 folder, which is used to hold the generated LMDB files:
/home/software/caffe/examples/VOC0712
/home/software/caffe/examples/VOC0712_bak

In the /home/software/caffe/models/VGGNet directory, make a backup copy of the original VOC0712 folder, which is used to hold the generated SSD_300x300 model folder:
/home/software/caffe/models/VGGNet/VOC0712
/home/software/caffe/models/VGGNet/VOC0712_bak

In the /home/software/caffe/jobs/VGGNet directory, make a backup copy of the original VOC0712 folder, which holds generated files such as VGG_VOC0712_SSD_300x300.sh. When the run_soon parameter in ssd_pascal.py (and the other Python scripts under /home/software/caffe/examples/ssd/) is set to False, you must run this .sh file manually to start model training:
/home/software/caffe/jobs/VGGNet/VOC0712
/home/software/caffe/jobs/VGGNet/VOC0712_bak
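Taken together, these backup copies can be made with a few commands (a sketch only, assuming the directory layout described above):

cd /home/software/caffe
cp -r data/VOC0712 data/VOC0712_bak
cp -r examples/VOC0712 examples/VOC0712_bak
cp -r models/VGGNet/VOC0712 models/VGGNet/VOC0712_bak
cp -r jobs/VGGNet/VOC0712 jobs/VGGNet/VOC0712_bak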

In the /home/software/caffe/jobs/VGGNet directory, create a new results folder to hold the model training results, so that the training outputs are kept in one place:
/home/software/caffe/jobs/VGGNet/results

2. Modifying the paths in the original files

Because step 1 created the user's own directories (so as not to destroy the original data), the paths inside each file must be modified to point to those directories.
1). Modify create_list.sh and create_data.sh in the /home/software/caffe/data/VOC0712 directory

# create_list.sh
#!/bin/bash
# data_root_dir="$HOME/data/VOCdevkit"
root_dir=/home/working/VOCdevkit  ## modified
sub_dir=ImageSets/Main
bash_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
for dataset in trainval test
do
  dst_file=$bash_dir/$dataset.txt
  if [ -f $dst_file ]
  then
    rm -f $dst_file
  fi
  for name in VOC2017  ## modified
  # ...

# create_data.sh
#!/bin/bash
cur_dir=$(cd $(dirname ${BASH_SOURCE[0]}) && pwd)
root_dir=$cur_dir/../..

cd $root_dir

redo=1
# data_root_dir="$HOME/data/VOCdevkit"
data_root_dir="/home/working/VOCdevkit"  ## modified
dataset_name="BIRD2017"
# ...

2). Back up /home/software/caffe/examples/ssd/ssd_pascal.py

/home/software/caffe/examples/ssd/ssd_pascal_bak.py
/home/software/caffe/examples/ssd/ssd_pascal.py
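For example (a sketch; run from the Caffe root, using the backup name listed above):

cd /home/software/caffe
cp examples/ssd/ssd_pascal.py examples/ssd/ssd_pascal_bak.py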

Modify the path and configuration:

train_data = "examples/VOC0712/VOC0712_trainval_lmdb"   # training data path
test_data = "examples/VOC0712/VOC0712_test_lmdb"        # test data path
model_name = "VGG_VOC0712_{}".format(job_name)          # model name
save_dir = "models/VGGNet/VOC0712/{}".format(job_name)      # model save path
snapshot_dir = "models/VGGNet/VOC0712/{}".format(job_name)  # snapshot save path
job_dir = "jobs/VGGNet/VOC0712/{}".format(job_name)         # job save path

# test result .txt save path, originally:
output_result_dir = "{}/data/VOCdevkit/results/VOC2007/{}/Main".format(os.environ['HOME'], job_name)
# changed to:
output_result_dir = "jobs/VGGNet/results/VOC2007/{}/Main".format(job_name)

name_size_file = "data/VOC0712/test_name_size.txt"
label_map_file = "data/VOC0712/labelmap_voc.prototxt"   # label file path
num_classes = 21     # total number of classes

gpus = "0,1,2,3"     # which GPUs to use, changed to:
gpus = "0"           # only 1 GPU

batch_size = 32      # number of images processed at once, changed to:
batch_size = 2       # 2 barely fits on a Jetson TX1 in the author's tests
accum_batch_size = 2

# Evaluate on whole test set.
num_test_image = 4952   # number of test images; must be consistent with test_name_size.txt
test_batch_size = 2     # 2 barely fits on a Jetson TX1 in the author's tests


run_soon = True    # automatically starts training after the files are generated, changed to:
run_soon = False   # block automatic start; you then need to run '/home/software/caffe/jobs/VGGNet/VOC0712/VGG_VOC0712_SSD_300x300.sh' manually

Learning rate settings: the learning rate should not be set too large here. I set base_lr to 0.000002; the effective learning rate becomes base_lr * 25 = 0.00005.

# Use different initial learning rate.
if use_batchnorm:
    base_lr = 0.0004
else:
    # A learning rate for batch_size = 1, num_gpus = 1.
    base_lr = 0.000002

Model save interval:

solver_param = {
    # ...
    'snapshot': 10000,   # originally 80000; set a smaller value so the model is saved in time, otherwise after an error training has to start again from 0
    # ...
}
3. Generating Files

Note: the following commands must be executed from the Caffe root directory, otherwise errors will occur.

1). Run create_list.sh

cd /home/software/caffe
./data/VOC0712/create_list.sh

This generates trainval.txt, test.txt, and test_name_size.txt in data/VOC0712.

2). Run create_data.sh

cd /home/software/caffe
./data/VOC0712/create_data.sh

This generates the LMDB files under examples/VOC0712 (VOC0712_trainval_lmdb and VOC0712_test_lmdb).

3). Run ssd_pascal.py

cd /home/software/caffe
python examples/ssd/ssd_pascal.py
./jobs/VGGNet/VOC0712/VGG_VOC0712_SSD_300x300.sh   # if run_soon is True, this step is not needed

Training now begins; the process takes quite a long time.

4. Model Testing

1). Test on the image dataset

cd /home/software/caffe
python examples/ssd/score_ssd_pascal.py

2). Webcam test:

Run ssd_pascal_webcam.py from the Caffe root directory; it performs real-time detection with a webcam. The caffemodel it reads is the latest model in caffe/models/VGGNet/VOC0712/SSD_300x300_webcam, so remember to put your trained model in that folder, and change label_map_file in ssd_pascal_webcam.py to point to your labelmap_voc.prototxt.
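For example, copying the trained snapshot into the folder the webcam script reads from might look like this (a sketch; the iteration number is only an example from this article's training run):

cd /home/software/caffe
mkdir -p models/VGGNet/VOC0712/SSD_300x300_webcam
cp models/VGGNet/VOC0712/SSD_300x300/VGG_VOC0712_SSD_300x300_iter_120000.caffemodel models/VGGNet/VOC0712/SSD_300x300_webcam/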

cd /home/software/caffe
python examples/ssd/ssd_pascal_webcam.py

3). Test with ssd_detect.cpp:

./build/examples/ssd/ssd_detect.bin models/VGGNet/VOC0712/SSD_300x300/deploy.prototxt models/VGGNet/VOC0712/SSD_300x300/VGG_VOC0712_SSD_300x300_iter_120000.caffemodel examples/videos/test.txt --file_type video --out_file output.txt --confidence_threshold 0.4
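Here examples/videos/test.txt is simply a list of input paths, one per line. A sketch of creating it for a single video (the video file name below is only a placeholder):

cd /home/software/caffe
echo "examples/videos/my_test_video.mp4" > examples/videos/test.txt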



