OpenCV Learning: Binocular Depth Vision and Ranging Implemented Entirely with OpenCV

Contents

1 Description
2 Binocular ranging principle
3 OpenCV implementation of binocular ranging
4 Binocular ranging code description
5 Binocular ranging code and implementation
6 Next

1 Description

Before I forget, here is a summary of the binocular vision work I finished a while ago.
There are many explanations of the principle of binocular vision online; I will simply record my own understanding of it.
The implementation mainly follows the great god's blog:
http://blog.csdn.net/chenyusiyuan/article/list/1
and these two posts:
http://blog.csdn.net/sunanger_wang/article/details/7744015
http://blog.csdn.net/scyscyao/article/details/5443341

Operating environment:
1. Windows 10
2. OpenCV 2.4.9
3. Visual Studio 2013
4. Two Microsoft HD-3000 cameras

2 Binocular ranging principle

First, my understanding of how binocular vision works (not guaranteed to be correct):
Start with the classic diagram:

This diagram illustrates the basic principle of binocular ranging, which is to work out the distance Z.
In the equation for Z at the lower right of that diagram:
f is each camera's own focal length, that is, the distance between the sensor and the lens.
T is the distance between the two camera lenses. Both of these are fixed.
d is not fixed: d is the distance between the images of the object on the two sensors, that is, the distance between x_l and x_r, and it is a variable.

So to obtain the distance we need to measure d each time; once we have d, Z follows from the similar-triangles relation shown below.
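For reference, here is the standard similar-triangles derivation behind the formula in the figure, written as a sketch under the usual assumptions (rectified cameras, disparity d = x_l - x_r):

% The segment of length T - (x_l - x_r) at depth Z - f and the full
% baseline T at depth Z form similar triangles:
\frac{T - (x_l - x_r)}{Z - f} = \frac{T}{Z}
\qquad\Longrightarrow\qquad
Z = \frac{f\,T}{x_l - x_r} = \frac{f\,T}{d}

Since f and T are fixed by the hardware and the calibration, Z depends only on the measured disparity d.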

Now the question is: why are the camera sensor and the lens shown in reversed positions in the picture above?
Look at this picture:

For convenience of mathematical processing, researchers usually use the virtual plane V to replace the imaging plane I, where the virtual plane V is located between the focal plane F and the object and is symmetric to the imaging plane about the focal plane.

3 OpenCV implementation of binocular ranging

That is the brief introduction to the principle of binocular vision; I will not write out the formulas, since they have little to do with the implementation.
The principle is one thing; actually implementing binocular vision with OpenCV is quite another.
The main steps for binocular ranging with OpenCV are:

1. Binocular calibration and rectification, to obtain the camera parameter matrices (a sketch follows this step):
Calibrate the parameter matrices of the two cameras
cvStereoRectify performs the stereo rectification
initUndistortRectifyMap generates the pixel mapping matrices needed to rectify the two images
cvRemap remaps the two images separately
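As a minimal sketch of this step, using the C++ equivalents of the functions named above (cv::stereoRectify, cv::initUndistortRectifyMap, cv::remap); the matrices M1, D1, M2, D2, R, T are assumed to come from a previous stereo calibration such as the parameter file described later:

// Sketch: rectify one left/right image pair so that corresponding points
// end up on the same image row. M1/D1 and M2/D2 are the intrinsics and
// distortion coefficients of the two cameras, R and T their relative
// rotation and translation, all assumed to come from a prior calibration.
#include <opencv2/calib3d/calib3d.hpp>
#include <opencv2/imgproc/imgproc.hpp>

void rectifyPair(const cv::Mat& M1, const cv::Mat& D1,
                 const cv::Mat& M2, const cv::Mat& D2,
                 const cv::Mat& R,  const cv::Mat& T,
                 const cv::Mat& leftRaw, const cv::Mat& rightRaw,
                 cv::Mat& leftRect, cv::Mat& rightRect, cv::Mat& Q)
{
    cv::Size imgSize = leftRaw.size();
    cv::Mat R1, R2, P1, P2;

    // Rectification transforms for both cameras, plus the reprojection matrix Q.
    cv::stereoRectify(M1, D1, M2, D2, imgSize, R, T,
                      R1, R2, P1, P2, Q, cv::CALIB_ZERO_DISPARITY);

    // Pixel mapping matrices needed to rectify each image.
    cv::Mat map11, map12, map21, map22;
    cv::initUndistortRectifyMap(M1, D1, R1, P1, imgSize, CV_16SC2, map11, map12);
    cv::initUndistortRectifyMap(M2, D2, R2, P2, imgSize, CV_16SC2, map21, map22);

    // Remap the two images separately.
    cv::remap(leftRaw,  leftRect,  map11, map12, cv::INTER_LINEAR);
    cv::remap(rightRaw, rightRect, map21, map22, cv::INTER_LINEAR);
}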

2. Stereo matching to obtain the disparity map (a sketch follows this step):
StereoBM generates the disparity map
Preprocessing: normalize the images to reduce brightness differences and enhance texture
Matching: slide a SAD window and search for matches along the horizontal line; because rectification has made the left and right images parallel, a feature in the left image can be matched to its best correspondence in the right image along the same row
Post-filtering: remove bad matches using uniquenessRatio
Output the disparity map: where left-right matches are dense, more points are matched and the result looks quite similar to the original image; where matches are sparse, it looks less similar
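A minimal sketch of this step with the OpenCV 2.4 StereoBM interface; the parameter values are only illustrative and would need tuning:

// Sketch: block matching on a rectified grayscale pair. disp is the raw
// 16-bit disparity map (scaled by 16), disp8 an 8-bit version for display.
#include <opencv2/calib3d/calib3d.hpp>

cv::Mat computeDisparity(const cv::Mat& leftGray, const cv::Mat& rightGray,
                         cv::Mat& disp8)
{
    cv::StereoBM bm;
    bm.state->preFilterCap        = 31;   // preprocessing: clamp normalized intensities
    bm.state->SADWindowSize       = 21;   // size of the sliding SAD window (odd)
    bm.state->minDisparity        = 0;
    bm.state->numberOfDisparities = 64;   // search range along the horizontal line (multiple of 16)
    bm.state->textureThreshold    = 10;   // reject low-texture blocks
    bm.state->uniquenessRatio     = 15;   // post-filter: drop ambiguous matches
    bm.state->speckleWindowSize   = 100;
    bm.state->speckleRange        = 32;

    cv::Mat disp;
    bm(leftGray, rightGray, disp, CV_16S);
    disp.convertTo(disp8, CV_8U, 255.0 / (bm.state->numberOfDisparities * 16.0));
    return disp;
}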

3. Obtain the distance:

Feed the generated disparity map into the reprojectImageTo3D() function to generate a 3D point cloud, which holds the three-dimensional coordinates of every pixel of the 2D image; then read out the Z-axis value of those coordinates in each frame to get the distance data. A sketch of this step follows.
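A minimal sketch of this reprojection step, assuming the 16-bit disparity map from StereoBM above and the Q matrix produced by cv::stereoRectify:

// Sketch: reproject the disparity map to a 3-D point cloud and read off Z.
#include <opencv2/calib3d/calib3d.hpp>

cv::Mat reprojectToPointCloud(const cv::Mat& disp16S, const cv::Mat& Q)
{
    // StereoBM outputs disparity scaled by 16, so convert to float first.
    cv::Mat dispF;
    disp16S.convertTo(dispF, CV_32F, 1.0 / 16.0);

    cv::Mat xyz;                                  // CV_32FC3: (X, Y, Z) for every pixel
    cv::reprojectImageTo3D(dispF, xyz, Q, true);  // true: unmatched pixels get a very large Z
    return xyz;
}

// Distance at pixel (x, y), in the same units as the calibration baseline T.
float distanceAt(const cv::Mat& xyz, int x, int y)
{
    return xyz.at<cv::Vec3f>(y, x)[2];
}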

4 Binocular ranging code description

My binocular program is written very simply and plainly, and some key code is borrowed from the great god's code. To be honest, the code is mostly just a particular way of using a few OpenCV functions, but overall there are still big differences:

First of all, the great god's program is based on MFC, so porting it to an ARM board or a Linux system would be very troublesome. For that reason my entire program, including the image display, the disparity map display, the distance display, and so on, is implemented purely with OpenCV functions.

Furthermore, regarding the way distance is output: in the great god's program, as I understand it, the contour of the nearest object is detected first and then the distance coordinates of that contour are extracted from the three-dimensional point cloud. That did not work out very well for me: if the quality of the disparity map is poor, no contour is detected, the function is never triggered, and no distance is output at all. So in my program the distance is obtained by clicking a point on the disparity map with the mouse, which then displays the distance at that point; so far the readings are not very accurate, probably because some parameters are not tuned right yet. A hedged sketch of this click-to-measure idea follows.
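Purely as an illustration of the click-to-measure idea (the window name, the global point-cloud Mat, and the callback are made up here, not taken from the original program):

// Sketch: clicking a pixel in the disparity window prints its (x, y)
// coordinates and the Z value stored in the point cloud at that pixel.
#include <cstdio>
#include <opencv2/highgui/highgui.hpp>

static cv::Mat g_xyz;   // refreshed every frame with the output of reprojectImageTo3D()

static void onDisparityClick(int event, int x, int y, int /*flags*/, void* /*userdata*/)
{
    if (event != cv::EVENT_LBUTTONDOWN || g_xyz.empty())
        return;
    if (x < 0 || y < 0 || x >= g_xyz.cols || y >= g_xyz.rows)
        return;
    cv::Vec3f p = g_xyz.at<cv::Vec3f>(y, x);
    std::printf("pixel (%d, %d) -> Z = %.1f\n", x, y, p[2]);
}

// After the disparity window has been created:
//   cv::namedWindow("disparity");
//   cv::setMouseCallback("disparity", onDisparityClick, 0);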

It is important to note that for two cameras there is only one set of parameter matrices (as long as the relative position between the two cameras does not change), so the calibration process only needs to be done once. My program therefore does not do any calibration itself (too troublesome); instead it reads the parameters from an external file, Calib_Paras.xml. This file can be produced by running the calibration in the great god's code, and it should also be possible to generate it with the MATLAB calibration toolbox (link below), but I did not manage that (I do not know how to use the parameter data that MATLAB generates).
http://www.vision.caltech.edu/bouguetj/calib_doc/

5 Binocular ranging code and implementation

My program is only responsible for opening the cameras, displaying the images, generating the disparity map, displaying the disparity map, computing the point cloud and the distance, and displaying the distance.
In other words, it starts from the point where calibration has already been done: the program contains no camera-calibration step or functions, so running it correctly requires the Calib_Paras.xml file, i.e. the file of calibrated parameters, which can be generated by running the great god's code after calibration. A sketch of loading such a file is shown below.
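A minimal sketch of loading the calibrated parameters with cv::FileStorage; the node names used here ("M1", "D1", "M2", "D2", "R", "T") are hypothetical and must match whatever the calibration program actually wrote into Calib_Paras.xml:

// Sketch: read previously calibrated stereo parameters from an XML file.
#include <string>
#include <opencv2/core/core.hpp>

bool loadCalibration(const std::string& path,
                     cv::Mat& M1, cv::Mat& D1, cv::Mat& M2, cv::Mat& D2,
                     cv::Mat& R,  cv::Mat& T)
{
    cv::FileStorage fs(path, cv::FileStorage::READ);
    if (!fs.isOpened())
        return false;                 // without this file the program cannot run correctly
    fs["M1"] >> M1;  fs["D1"] >> D1;  // left camera intrinsics and distortion (hypothetical names)
    fs["M2"] >> M2;  fs["D2"] >> D2;  // right camera intrinsics and distortion
    fs["R"]  >> R;   fs["T"]  >> T;   // rotation and translation between the two cameras
    fs.release();
    return true;
}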
And to make sure everything works, it is best to first confirm that your computer can run the great god's program (address below).
https://github.com/yuhuazou/StereoVision

My program:

#include "opencv2/video/tracking.hpp"
#include "opencv2/imgproc/imgproc.hpp"
#include "opencv2/highgui/highgui.hpp"
#include <cv.h>
#include <cxmisc.h>
#include ...
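Purely as a hedged sketch of the flow described above (open both cameras, rectify, match, reproject, display, click to measure), and not the original program, the pieces from the earlier sketches could be tied together like this:

// Hedged skeleton, not the original program: assumes the helper functions
// sketched earlier (loadCalibration, rectifyPair, computeDisparity,
// reprojectToPointCloud, onDisparityClick, g_xyz) are in the same file.
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>

int main()
{
    cv::Mat M1, D1, M2, D2, R, T;
    if (!loadCalibration("Calib_Paras.xml", M1, D1, M2, D2, R, T))
        return -1;

    cv::VideoCapture capL(0), capR(1);           // the two USB cameras
    if (!capL.isOpened() || !capR.isOpened())
        return -1;

    cv::namedWindow("disparity");
    cv::setMouseCallback("disparity", onDisparityClick, 0);

    for (;;)
    {
        cv::Mat frameL, frameR;
        capL >> frameL;
        capR >> frameR;
        if (frameL.empty() || frameR.empty())
            break;

        // Recomputing the rectification maps every frame is wasteful but keeps the sketch short.
        cv::Mat rectL, rectR, Q;
        rectifyPair(M1, D1, M2, D2, R, T, frameL, frameR, rectL, rectR, Q);

        cv::Mat grayL, grayR, disp8;
        cv::cvtColor(rectL, grayL, cv::COLOR_BGR2GRAY);
        cv::cvtColor(rectR, grayR, cv::COLOR_BGR2GRAY);
        cv::Mat disp = computeDisparity(grayL, grayR, disp8);

        g_xyz = reprojectToPointCloud(disp, Q);  // point cloud read by the mouse callback

        cv::imshow("left", rectL);
        cv::imshow("right", rectR);
        cv::imshow("disparity", disp8);
        if (cv::waitKey(30) == 27)               // Esc to quit
            break;
    }
    return 0;
}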

The binocular camera setup I used

Front photo

A screenshot of the program running

The screenshot shows the images from the left and right cameras, the distance display, and the disparity map display.

As you can see, the disparity map is not very accurate. The first reason is that I have not had time to tune the parameters carefully; the main reason, though, is that someone keeps moving my cameras. I spend half a day getting a good calibration, someone casually bumps the rig, the relative position of the two cameras changes, and all the previous calibration work is wasted, so in the end I simply stopped worrying about it.

Since the disparity map is off, the distance information is certainly not very accurate either; in most cases it reads 16000. In the figure above, clicking the red point in the disparity map with the mouse makes the distance display first output the point's x and y coordinates and then its distance coordinate, that is, the value of the Z axis.

In general, it still needs tuning.

6 Next

The above is roughly what I got done over this period. The original plan was to port the binocular ranging to an NVIDIA Jetson TK1 to see how it performs, but I did not finish. The current status is that the code above compiles successfully on the TK1 but simply refuses to run. I suspect the TK1 board does not support opening and displaying two cameras at once; I think this problem could be solved with some more time, but I am busy with other things now and too lazy to dig into it.

The next thing to do is something else entirely, so the OpenCV work will have to be set aside for now.
