Learning Strategy of the TLD Dynamic Tracking System: P-N Learning


This article is from http://blog.sina.com.cn/s/blog_80e381d101015fza.html

1. Overview

The paper shows that the performance of a binary classifier can be improved by exploiting the structure of unlabeled data during training. Data are said to be structured if knowing the label of one sample restricts the possible labels of other samples.

The paper proposes P-N learning, which trains a binary classifier from both labeled and unlabeled samples. The training process is guided by positive and negative structural constraints that restrict the labeling of the unlabeled set. In each iteration, P-N learning evaluates the classifier on the unlabeled data, identifies the samples whose classification results conflict with the structural constraints, corrects their labels, and adds them to the training set. A theory is developed to describe when P-N learning improves on the initial classifier, and it is verified on both synthetic and real data.

P-N learning is applied to the online learning of an object detector during tracking. The paper shows that an accurate object detector can be trained from a single labeled example and a video sequence in which the object may appear.

 

Contribution of this paper: formalizing the P-N learning model for both offline and online learning problems.

 

2. Basic Concepts and Methods in this article:

Semi-supervised learning: a topic of considerable recent interest. During classifier training, information from both labeled and unlabeled data is exploited, and it has been found that adding unlabeled data can effectively improve classifier performance. The key observation is that labeling the unlabeled data with an independent measure, rather than with the classifier's own confidence, greatly improves the detector.

 

Structured data: in computer vision, data are rarely independent because of temporal and spatial associations. Data whose labels are mutually dependent are called structured data; that is, if the label of one sample restricts the labels of other samples, the data are structured.

Constraints: the paper proposes a new method for learning from structured unlabeled data. The structure in the data is exploited through so-called "positive structural constraints" and "negative structural constraints", which restrict the labels that may be assigned to the unlabeled sample set.

A positive constraint specifies which samples satisfying it should receive a positive label, and a negative constraint specifies which samples satisfying it should receive a negative label.

Positive and negative constraints are used in parallel; the paper shows that, combined, they can compensate for each other's errors.

These constraints operate on the entire unlabeled sample set, so the information they provide differs from that of a classifier trained only on the labeled samples.

 

Learning strategy: a small number of labeled samples plus a large number of structured unlabeled samples suggest the following learning strategy:

(i) Train an initial classifier on the labeled samples and adjust the predefined constraints using the labeled data.

(ii) Label the unlabeled data with the current classifier, then identify the samples whose assigned labels conflict with the constraints.

(iii) Correct the labels of these samples, add them to the training set, and retrain the classifier.

The paper calls this classifier bootstrapping process P-N learning.
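A minimal sketch of this bootstrapping loop in Python, assuming a generic classifier object with fit/predict methods and hypothetical p_constraint / n_constraint functions that return the indices of unlabeled samples whose predicted labels violate the positive and negative constraints (none of these names come from the paper):

```python
import numpy as np


def pn_learning(classifier, X_labeled, y_labeled, X_unlabeled,
                p_constraint, n_constraint, n_iterations=10):
    """Sketch of the P-N bootstrapping loop (hypothetical interfaces).

    p_constraint(X, y_pred) -> indices that should be relabeled +1
        although the classifier predicted -1 (missed positives).
    n_constraint(X, y_pred) -> indices that should be relabeled -1
        although the classifier predicted +1 (false positives).
    """
    # (i) Train an initial classifier on the labeled samples.
    X_train, y_train = X_labeled.copy(), y_labeled.copy()
    classifier.fit(X_train, y_train)

    for _ in range(n_iterations):
        # (ii) Label the unlabeled data with the current classifier.
        y_pred = classifier.predict(X_unlabeled)

        # Identify samples whose labels conflict with the constraints.
        pos_idx = p_constraint(X_unlabeled, y_pred)
        neg_idx = n_constraint(X_unlabeled, y_pred)
        if len(pos_idx) == 0 and len(neg_idx) == 0:
            break  # no conflicts left, stop iterating

        # (iii) Correct the labels, add the samples to the training set,
        # and retrain the classifier.
        X_new = np.vstack([X_unlabeled[pos_idx], X_unlabeled[neg_idx]])
        y_new = np.concatenate([np.ones(len(pos_idx)), -np.ones(len(neg_idx))])
        X_train = np.vstack([X_train, X_new])
        y_train = np.concatenate([y_train, y_new])
        classifier.fit(X_train, y_train)

    return classifier
```

In the tracking application described below, the positive constraint corresponds to patches on the confirmed trajectory and the negative constraint to detections far away from it.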

3. Online Learning of the Target Detector from Video Data:
Strategy: we consider the class of real-time detectors based on a scanning-window strategy. The input image is scanned over all positions and scales, and at each sub-window a binary classifier decides whether the object is present.
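A rough sketch of the scanning-window step, assuming a classify_patch function that returns Pr(y = 1 | patch) for a grayscale sub-window; the window size, step, scales, and threshold below are illustrative only:

```python
import numpy as np


def scan_image(image, classify_patch, window=(40, 40),
               step=8, scales=(0.5, 1.0, 2.0), threshold=0.5):
    """Slide a window over all positions and scales; return accepted boxes."""
    detections = []
    h, w = window
    for s in scales:
        wh, ww = int(h * s), int(w * s)  # scaled window size
        for y in range(0, image.shape[0] - wh + 1, step):
            for x in range(0, image.shape[1] - ww + 1, step):
                patch = image[y:y + wh, x:x + ww]
                # Binary decision at each sub-window.
                if classify_patch(patch) > threshold:
                    detections.append((x, y, ww, wh))
    return detections
```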

The randomized forest classifier: it consists of a number of ferns (simplified trees).

 

Each fern performs a series of measurements on the input patch to generate a feature vector that points to a specific leaf node, from which the posterior probability Pr(y = 1 | x_i) is read.

During training, each leaf node records the number of positive samples p and negative samples n that reach it, and the maximum likelihood estimate of the posterior is Pr(y = 1 | x_i) = p / (p + n). If the leaf node is empty, the posterior is set to zero.
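A small sketch of how one fern could keep these leaf counts and how the forest could combine them; averaging the per-fern posteriors is an assumption here, since the text does not say how the ferns are combined:

```python
import numpy as np


class Fern:
    """One fern: counts positives/negatives per leaf, posterior = p / (p + n)."""

    def __init__(self, n_leaves):
        self.p = np.zeros(n_leaves)  # positive counts per leaf
        self.n = np.zeros(n_leaves)  # negative counts per leaf

    def update(self, leaf, label):
        if label > 0:
            self.p[leaf] += 1
        else:
            self.n[leaf] += 1

    def posterior(self, leaf):
        total = self.p[leaf] + self.n[leaf]
        # Empty leaf -> posterior set to zero, as in the text.
        return 0.0 if total == 0 else self.p[leaf] / total


def forest_posterior(ferns, leaf_indices):
    """Combine the per-fern posteriors Pr(y=1 | x) by averaging (assumption)."""
    return np.mean([f.posterior(leaf) for f, leaf in zip(ferns, leaf_indices)])
```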

Features: 2-bit binary patterns (2bitBP).

These features measure the gradient orientation within a given area and quantize it into a small set of codes. 2bitBP was inspired by LBP (local binary patterns) but differs from it: LBP encodes the 3x3 pixel neighborhood around a point, so it needs 8 bits (256 codes), whereas 2bitBP describes a whole region with a single code and needs only 2 bits (4 codes).
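A hedged sketch of one common formulation of a 2-bit binary pattern: compare the mean intensities of the left/right and top/bottom halves of a rectangular region, which quantizes the dominant gradient orientation into one of four codes. The exact region layout used in the paper may differ; this is only illustrative:

```python
import numpy as np


def two_bit_bp(image, x, y, w, h):
    """Encode a rectangular region with a single 2-bit code (4 possible values).

    Bit 0: is the left half brighter than the right half?
    Bit 1: is the top half brighter than the bottom half?
    """
    region = image[y:y + h, x:x + w].astype(np.float64)
    left, right = region[:, : w // 2], region[:, w // 2:]
    top, bottom = region[: h // 2, :], region[h // 2:, :]
    code = 0
    if left.mean() > right.mean():
        code |= 1
    if top.mean() > bottom.mean():
        code |= 2
    return code  # one of 4 codes, vs. 256 codes for a 3x3 LBP
```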

The classifier is initialized in the first frame: affine warps of the selected image patch generate 300 positive samples for training the initial classifier. The classifier is then evaluated on the whole image, and detections far from the target location are used as negative samples to update the initial classifier; negatives are extracted from the windows where Pr(y = 1) > Pr(y = -1).
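A rough sketch of this first-frame initialization, using OpenCV only for the affine warps. The warp ranges, the distance threshold for "far from the target", and the helper names are assumptions; only the 300 positive warps and the Pr(y = 1) > Pr(y = -1) rule come from the text:

```python
import numpy as np
import cv2  # OpenCV, used here only for the affine warps


def generate_positives(first_frame, box, n_warps=300):
    """Warp the selected patch with small random affine transforms -> positives."""
    x, y, w, h = box
    patch = first_frame[y:y + h, x:x + w]
    positives = []
    rng = np.random.default_rng(0)
    for _ in range(n_warps):
        angle = rng.uniform(-10, 10)           # small rotation (degrees), assumed range
        scale = rng.uniform(0.95, 1.05)        # small scale change, assumed range
        M = cv2.getRotationMatrix2D((w / 2, h / 2), angle, scale)
        M[:, 2] += rng.uniform(-2, 2, size=2)  # small translation, assumed range
        positives.append(cv2.warpAffine(patch, M, (w, h)))
    return positives


def mine_negatives(detections, posteriors, target_box, min_dist=50):
    """Patches far from the target that the classifier accepts become negatives."""
    tx, ty = target_box[0], target_box[1]
    negatives = []
    for (x, y, w, h), pr_pos in zip(detections, posteriors):
        far = np.hypot(x - tx, y - ty) > min_dist
        if far and pr_pos > 0.5:  # i.e. Pr(y=1|x) > Pr(y=-1|x)
            negatives.append((x, y, w, h))
    return negatives
```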

 

Constraints used:

The structured spatio-temporal information of the target data is concentrated along the target trajectory. Once a correct trajectory is confirmed, the constraints are reliable and effective.

(i) Define the confidence of the tracker as the NCC (normalized cross-correlation) between the tracked image patch and the patch selected in the first frame (a small NCC sketch follows this list).

(ii) If the confidence of the last frame of the trajectory is higher than 80%, the trajectory is considered correct.

(iii) Apply the P-N constraints on the basis of the correct trajectory.
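A small sketch of the NCC confidence from constraint (i) and the 80% rule from constraint (ii); the patch representation (equally sized grayscale arrays) is an assumption:

```python
import numpy as np


def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized grayscale patches."""
    a = patch_a.astype(np.float64).ravel()
    b = patch_b.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return 0.0 if denom == 0 else float(np.dot(a, b) / denom)


def trajectory_is_correct(first_frame_patch, last_tracked_patch, threshold=0.80):
    """Constraint (ii): trust the trajectory if the last-frame confidence > 80%."""
    return ncc(first_frame_patch, last_tracked_patch) > threshold
```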

 

References:

 

[1] Zdenek Kalal (University of Surrey) et al. P-N Learning: Bootstrapping Binary Classifiers by Structural Constraints. CVPR 2010.

 

[6] Zdenek Kalal (University of Surrey) et al. Forward-Backward Error: Automatic Detection of Tracking Failures. ICPR 2010.

 
