This article is from http://blog.csdn.net/hellogv/; please credit the source when referencing it!
I have some free time lately, so I will keep writing a few articles on computer vision, purely for the entertainment of newcomers...
I have already covered how to use AForge to implement multi-point motion tracking (AForge-based Gesture Recognition 3: Multi-Point Gesture Recognition), but that was on the PC platform. I have also ported the OpenCV library to WM/WinCE before, so this time we try to implement motion tracking on the WM/WinCE platform. The code is adapted from camshiftdemo.c, the example that ships with OpenCV.
This article first implements single-frame camshift recognition. The code in this article can be downloaded here:
Select the green part of the image, and the program will automatically circle the identified part in red.
Next, let's talk about how the program works (a rough code sketch follows the list of steps):
1. Compute the color histogram of the target to be tracked; this starts by separating the hue channel out of HSV.
2. Use the hue channel to build a histogram of the tracked color and generate a back-projection image.
3. Apply a mask to highlight the tracked color in the back-projection image.
4. Search the back projection and compute the region occupied by the tracked color.
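To make steps 1 to 3 concrete, here is a minimal single-frame sketch using OpenCV's old C API, in the spirit of camshiftdemo.c. The file name "frame.bmp", the hard-coded selection rectangle, and the threshold values passed to cvInRangeS are placeholders of mine, not values taken from the article's download; it goes as far as the masked back projection.

```c
#include <cv.h>
#include <highgui.h>

int main(void)
{
    /* Placeholders: the image file and the selection rectangle stand in
       for the camera frame and the green region the user marks. */
    IplImage *image = cvLoadImage("frame.bmp", CV_LOAD_IMAGE_COLOR);
    if (!image)
        return -1;
    CvRect selection = cvRect(100, 100, 40, 40);

    /* Step 1: convert RGB (BGR in OpenCV) to HSV and split out the hue plane. */
    IplImage *hsv  = cvCreateImage(cvGetSize(image), 8, 3);
    IplImage *hue  = cvCreateImage(cvGetSize(image), 8, 1);
    IplImage *mask = cvCreateImage(cvGetSize(image), 8, 1);
    cvCvtColor(image, hsv, CV_BGR2HSV);
    cvSplit(hsv, hue, 0, 0, 0);

    /* Pixels that are too dark or too unsaturated carry unreliable hue,
       so mask them out before building the histogram. */
    cvInRangeS(hsv, cvScalar(0, 30, 10, 0), cvScalar(180, 256, 256, 0), mask);

    /* Step 2: build a hue histogram from the selected region only. */
    int hdims = 16;
    float hranges_arr[] = { 0, 180 };
    float *hranges = hranges_arr;
    CvHistogram *hist = cvCreateHist(1, &hdims, CV_HIST_ARRAY, &hranges, 1);
    cvSetImageROI(hue, selection);
    cvSetImageROI(mask, selection);
    cvCalcHist(&hue, hist, 0, mask);
    float max_val = 0.f;
    cvGetMinMaxHistValue(hist, 0, &max_val, 0, 0);
    cvConvertScale(hist->bins, hist->bins, max_val ? 255.0 / max_val : 0.0, 0);
    cvResetImageROI(hue);
    cvResetImageROI(mask);

    /* Steps 2-3: back-project the histogram onto the whole hue plane,
       then AND it with the mask so only the tracked color stays bright. */
    IplImage *backproject = cvCreateImage(cvGetSize(image), 8, 1);
    cvCalcBackProject(&hue, backproject, hist);
    cvAnd(backproject, mask, backproject, 0);

    /* Show the intermediate result (the masked back projection). */
    cvNamedWindow("backproject", 1);
    cvShowImage("backproject", backproject);
    cvWaitKey(0);
    return 0;
}
```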
The intermediate results below, from running the single-frame recognition program on a PC, should help you understand how the code works:
The source image after conversion from RGB to HSV
The hue channel separated from HSV
The histogram built from the hue of the selected color (green in the source image)
The histogram back-projected onto the hue image, then masked to keep only the desired part
cvCamShift searches the highlighted region of the masked back projection and computes its extent
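Step 4 would then look roughly like the fragment below, continuing from the sketch above (it reuses image, selection and backproject from there). The termination-criteria values are the typical defaults from camshiftdemo.c rather than the article's exact code.

```c
    /* Step 4: cvCamShift searches the masked back projection, starting
       from the selection rectangle, and returns the located region as a
       rotated box plus a plain bounding rect. */
    CvConnectedComp track_comp;
    CvBox2D track_box;
    CvRect track_window = selection;
    cvCamShift(backproject, track_window,
               cvTermCriteria(CV_TERMCRIT_EPS | CV_TERMCRIT_ITER, 10, 1),
               &track_comp, &track_box);
    track_window = track_comp.rect;  /* would seed the search in the next frame */

    /* Circle the identified part in red on the original image. */
    cvEllipseBox(image, track_box, CV_RGB(255, 0, 0), 3, CV_AA, 0);
```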