Indoor three-dimensional reconstruction with PCL


I have three PrimeSense depth cameras on hand, fixed at different angles, and the goal is to stitch the point cloud data from the three cameras together using their RGB-D information.

The hardware is limited and the machine does not have enough capacity: the Point Cloud Library 1.8 environment has not been set up, so the data cannot be read in real time. In addition, the laptop's USB controller bus and interrupt limits mean the three cameras cannot be used simultaneously.

In this frustrating situation, here is how I did the three-dimensional reconstruction.


Environment for this article: Win7 + VS2012 + OpenCV 2.4.9 + OpenNI 2.0 + PCL 1.7.2


First, point cloud data acquisition

1. Capture depth and RGB data using OpenNI 2.0

2. Display the streams in real time with OpenCV (otherwise you have no idea what is being captured), and use the IplImage interface to copy the RGB data into the point cloud (assigning directly between point clouds gives incorrect results)

3. Use the PCL I/O interface to dump the data to a PCD file (save the PCD in binary form; ASCII performance is too poor; a short sketch of the two save variants follows)
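As a side note on step 3, both save variants are available from PCL's I/O interface. A minimal sketch of the choice, assuming a filled pcl::PointCloud<pcl::PointXYZRGB> like the cloud built in the listing below:

#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>

void saveCloud(const pcl::PointCloud<pcl::PointXYZRGB> &cloud)
{
    // ASCII: human-readable, but the file is large and slow to write and parse
    pcl::io::savePCDFileASCII("cloud_ascii.pcd", cloud);
    // Compressed binary: compact and much faster, which is what the program below uses
    pcl::io::savePCDFileBinaryCompressed("cloud_binary.pcd", cloud);
}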

// Standard library headers
#include <iostream>
#include <string>
#include <vector>
// PCL headers
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
// OpenCV headers
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/imgproc/imgproc.hpp>
// OpenNI header
#include <OpenNI.h>

typedef unsigned char uint8_t;

// namespaces
using namespace std;
using namespace openni;
using namespace cv;
using namespace pcl;

void checkOpenNIError(Status result, string status)
{
    if (result != STATUS_OK)
        cerr << status << " Error: " << OpenNI::getExtendedError() << endl;
}

int main(int argc, char **argv)
{
    Status result = STATUS_OK;
    int i, j;
    float x = 0.0, y = 0.0, z = 0.0, xx = 0.0;

    // Point cloud
    PointCloud<PointXYZRGB> cloud;

    // OpenCV images
    Mat cvBGRImg;
    Mat cvDepthImg;

    // OpenNI2 frames
    VideoFrameRef oniDepthImg;
    VideoFrameRef oniColorImg;

    namedWindow("depth");
    namedWindow("image");
    char key = 0;

    // Initialize OpenNI
    result = OpenNI::initialize();
    checkOpenNIError(result, "initialize context");

    // Open device
    Device device;
    result = device.open(openni::ANY_DEVICE);
    checkOpenNIError(result, "open device");

    // Create depth stream
    VideoStream oniDepthStream;
    result = oniDepthStream.create(device, openni::SENSOR_DEPTH);
    checkOpenNIError(result, "create depth stream");

    // Set depth video mode
    VideoMode modeDepth;
    modeDepth.setResolution(640, 480);
    modeDepth.setFps(30);
    modeDepth.setPixelFormat(PIXEL_FORMAT_DEPTH_1_MM);
    oniDepthStream.setVideoMode(modeDepth);

    // Start depth stream
    result = oniDepthStream.start();
    checkOpenNIError(result, "start depth stream");

    // Create color stream
    VideoStream oniColorStream;
    result = oniColorStream.create(device, openni::SENSOR_COLOR);
    checkOpenNIError(result, "create color stream");

    // Set color video mode
    VideoMode modeColor;
    modeColor.setResolution(640, 480);
    modeColor.setFps(30);
    modeColor.setPixelFormat(PIXEL_FORMAT_RGB888);
    oniColorStream.setVideoMode(modeColor);

    // Start color stream
    result = oniColorStream.start();
    checkOpenNIError(result, "start color stream");

    while (true)
    {
        // Read a color frame and convert it to an OpenCV image
        if (oniColorStream.readFrame(&oniColorImg) == STATUS_OK)
        {
            Mat cvRGBImg(oniColorImg.getHeight(), oniColorImg.getWidth(),
                         CV_8UC3, (void*)oniColorImg.getData());
            cvtColor(cvRGBImg, cvBGRImg, CV_RGB2BGR);
            imshow("image", cvBGRImg);
        }

        // Read a depth frame and scale it to 8 bits for display
        if (oniDepthStream.readFrame(&oniDepthImg) == STATUS_OK)
        {
            Mat cvRawImg16U(oniDepthImg.getHeight(), oniDepthImg.getWidth(),
                            CV_16UC1, (void*)oniDepthImg.getData());
            cvRawImg16U.convertTo(cvDepthImg, CV_8U,
                                  255.0 / (oniDepthStream.getMaxPixelValue()));
            imshow("depth", cvDepthImg);
        }

        key = (char)cv::waitKey(1);

        // Quit
        if (key == 'q')
            break;

        // Capture depth and color data
        if (key == 'c')
        {
            // Get the raw depth data
            DepthPixel *pDepth = (DepthPixel*)oniDepthImg.getData();

            // Create the point cloud
            cloud.width = oniDepthImg.getWidth();
            cloud.height = oniDepthImg.getHeight();
            cloud.is_dense = false;
            cloud.points.resize(cloud.width * cloud.height);

            // Wrap the BGR Mat as an IplImage header for per-pixel access
            IplImage iplBGR = cvBGRImg;
            IplImage *test2 = &iplBGR;

            for (i = 0; i < oniDepthImg.getHeight(); i++)
            {
                for (j = 0; j < oniDepthImg.getWidth(); j++)
                {
                    float k = i;
                    float m = j;
                    xx = pDepth[i * oniDepthImg.getWidth() + j];
                    // Convert pixel coordinates plus depth to world coordinates (millimetres)
                    CoordinateConverter::convertDepthToWorld(oniDepthStream, m, k, xx, &x, &y, &z);
                    cloud[i * cloud.width + j].x = x;
                    cloud[i * cloud.width + j].y = y;
                    cloud[i * cloud.width + j].z = xx;  // the raw depth value is stored as z here
                    cloud[i * cloud.width + j].b = (uint8_t)test2->imageData[i * test2->widthStep + j * 3 + 0];
                    cloud[i * cloud.width + j].g = (uint8_t)test2->imageData[i * test2->widthStep + j * 3 + 1];
                    cloud[i * cloud.width + j].r = (uint8_t)test2->imageData[i * test2->widthStep + j * 3 + 2];
                }
            }

            pcl::io::savePCDFileBinaryCompressed("test_pcdc.pcd", cloud);
            cerr << "Saved " << cloud.points.size() << " data points to test_pcdc.pcd." << endl;
            imwrite("c_color.jpg", cvBGRImg);
            imwrite("c_depth.jpg", cvDepthImg);
        }
    }
}
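As an aside, the IplImage detour for the colour copy is not strictly required; if you prefer to stay with the C++ API, the same per-pixel copy can be done on the cv::Mat directly. A sketch, assuming cvBGRImg is the 640x480 CV_8UC3 image from the loop above:

// Inside the capture branch, instead of wrapping cvBGRImg in an IplImage:
for (int i = 0; i < oniDepthImg.getHeight(); i++)
{
    for (int j = 0; j < oniDepthImg.getWidth(); j++)
    {
        const cv::Vec3b &bgr = cvBGRImg.at<cv::Vec3b>(i, j);  // channel order is B, G, R
        cloud[i * cloud.width + j].b = bgr[0];
        cloud[i * cloud.width + j].g = bgr[1];
        cloud[i * cloud.width + j].r = bgr[2];
    }
}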

Press the C key to capture the point cloud data, and press the Q key to exit.

We get the following three sets of data:




Second, point cloud display

There is nothing to this part: just read the PCD file directly and display it.

#include <pcl/visualization/cloud_viewer.h>
#include <iostream>
#include <pcl/io/io.h>
#include <pcl/io/pcd_io.h>

int main()
{
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZRGB>);
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("test_pcda.pcd", *cloud) == -1)  // Open the point cloud file
    {
        PCL_ERROR("Couldn't read file test_pcd.pcd\n");
        return (-1);
    }
    std::cout << "Loaded " << cloud->width * cloud->height
              << " data points from test_pcd.pcd with the following fields:" << std::endl;
    pcl::visualization::CloudViewer viewer("My first cloud viewer");
    viewer.showCloud(cloud);
    while (!viewer.wasStopped())
    {
    }
}

The results corresponding to the three data sets:





Third, point cloud stitching

I run ICP directly on the raw data, so it is very slow! The explanation of ICP below is excerpted from the "Point Cloud Library (PCL) Learning Tutorial".

Iterative Closest Point (ICP): for the two point clouds to be stitched, first establish corresponding point sets P and Q according to certain criteria, where the number of corresponding point pairs is n. Then compute the optimal rigid transformation, i.e. the rotation matrix R and the translation vector t, by iterative least squares so that an error function is minimized. The ICP algorithm is simple and intuitive and can achieve good stitching accuracy, but its speed and its convergence to the global optimum depend largely on the given initial transformation estimate and on the correspondences established during iteration. Various coarse registration techniques can provide a better initial pose for ICP, and establishing correct corresponding point sets during iteration is important to avoid falling into a local extremum; this determines the convergence speed and the final stitching accuracy of the algorithm.
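To make the least-squares step concrete, the error function the tutorial refers to can be written in the standard ICP form, where p_i and q_i are the i-th corresponding points of P and Q:

E(R, t) = \frac{1}{n} \sum_{i=1}^{n} \left\| q_i - (R\,p_i + t) \right\|^2

Each iteration re-estimates the correspondences and then chooses the R and t that minimize E.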

ICP processing steps (a rough sketch of how these map onto the PCL API follows the list):

1. Sample the raw point cloud data

2. Determine the initial set of corresponding points

3. Remove incorrect corresponding point pairs

4. Solve for the coordinate transformation
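As a rough illustration of how these four steps map onto the PCL API (a sketch only; the leaf size, distance threshold and iteration limits are placeholder values chosen for illustration, not tuned settings):

#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/registration/icp.h>

// Step 1: downsample the raw cloud so ICP has far fewer points to match
pcl::PointCloud<pcl::PointXYZRGB>::Ptr downsample(
    const pcl::PointCloud<pcl::PointXYZRGB>::Ptr &in, float leaf)
{
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr out(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::VoxelGrid<pcl::PointXYZRGB> grid;
    grid.setInputCloud(in);
    grid.setLeafSize(leaf, leaf, leaf);   // e.g. 5 mm voxels (placeholder)
    grid.filter(*out);
    return out;
}

// Steps 2-4: control how correspondences are formed, rejected and solved
void configureIcp(pcl::IterativeClosestPoint<pcl::PointXYZRGB, pcl::PointXYZRGB> &icp)
{
    icp.setMaxCorrespondenceDistance(50.0); // steps 2-3: drop pairs farther apart than 50 mm (placeholder)
    icp.setMaximumIterations(50);           // step 4: cap the least-squares iterations
    icp.setTransformationEpsilon(1e-8);     // stop when the transform barely changes
    icp.setEuclideanFitnessEpsilon(1e-5);   // or when the mean squared error stops improving
}

Running downsample() on the three clouds before handing them to icp.setInputCloud()/setInputTarget() in the program below would also address the "very slow" problem mentioned above.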

#include <iostream>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/registration/icp.h>
#include <pcl/visualization/cloud_viewer.h>
#include <pcl/kdtree/kdtree_flann.h>
#include <time.h>

using namespace std;

int main(int argc, char** argv)
{
    clock_t start, finish;
    double totaltime;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_in(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_out(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_out2(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr my_cloud(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr my_cloud2(new pcl::PointCloud<pcl::PointXYZRGB>);

    // Load the three captured clouds, timing each load
    start = clock();
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("test_pcda.pcd", *cloud_in) == -1)
    {
        PCL_ERROR("Couldn't read file test_pcd.pcd\n");
        return (-1);
    }
    finish = clock();
    totaltime = (double)(finish - start) / CLOCKS_PER_SEC;
    cout << "\n load test_pcda.pcd data: " << totaltime << " seconds!" << endl;

    start = clock();
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("test_pcdb.pcd", *cloud_out) == -1)
    {
        PCL_ERROR("Couldn't read file test_pcd.pcd\n");
        return (-1);
    }
    finish = clock();
    totaltime = (double)(finish - start) / CLOCKS_PER_SEC;
    cout << "\n load test_pcdb.pcd data: " << totaltime << " seconds!" << endl;

    start = clock();
    if (pcl::io::loadPCDFile<pcl::PointXYZRGB>("test_pcdc.pcd", *cloud_out2) == -1)
    {
        PCL_ERROR("Couldn't read file test_pcd.pcd\n");
        return (-1);
    }
    finish = clock();
    totaltime = (double)(finish - start) / CLOCKS_PER_SEC;
    cout << "\n load test_pcdc.pcd data: " << totaltime << " seconds!" << endl;

    // Call the ICP API
    start = clock();
    pcl::IterativeClosestPoint<pcl::PointXYZRGB, pcl::PointXYZRGB> icp;
    icp.setInputCloud(cloud_in);
    icp.setInputTarget(cloud_out);
    pcl::PointCloud<pcl::PointXYZRGB> Final;
    icp.align(Final);
    std::cout << "has converged: " << icp.hasConverged()
              << " score: " << icp.getFitnessScore() << std::endl;
    std::cout << icp.getFinalTransformation() << std::endl;
    finish = clock();
    totaltime = (double)(finish - start) / CLOCKS_PER_SEC;
    cout << "\n first time call ICP process: " << totaltime << " seconds!" << endl;

    // Construct the intermediate stitched point cloud
    for (int i = 0; i < Final.points.size(); i++)
    {
        pcl::PointXYZRGB basic_point;
        basic_point.x = Final.points[i].x;
        basic_point.y = Final.points[i].y;
        basic_point.z = Final.points[i].z;
        basic_point.r = Final.points[i].r;
        basic_point.g = Final.points[i].g;
        basic_point.b = Final.points[i].b;
        my_cloud->points.push_back(basic_point);
    }

    // Call the ICP API a second time
    start = clock();
    icp.setInputCloud(cloud_out2);
    icp.setInputTarget(my_cloud);
    pcl::PointCloud<pcl::PointXYZRGB> Final2;
    icp.align(Final2);
    std::cout << "has converged: " << icp.hasConverged()
              << " score: " << icp.getFitnessScore() << std::endl;
    std::cout << icp.getFinalTransformation() << std::endl;
    finish = clock();
    totaltime = (double)(finish - start) / CLOCKS_PER_SEC;
    cout << "\n second time call ICP process: " << totaltime << " seconds!" << endl;

    my_cloud.reset();  // release the intermediate cloud

    // Construct the final stitched point cloud
    for (int i = 0; i < Final2.points.size(); i++)
    {
        pcl::PointXYZRGB basic_point;
        basic_point.x = Final2.points[i].x;
        basic_point.y = Final2.points[i].y;
        basic_point.z = Final2.points[i].z;
        basic_point.r = Final2.points[i].r;
        basic_point.g = Final2.points[i].g;
        basic_point.b = Final2.points[i].b;
        my_cloud2->points.push_back(basic_point);
    }

    pcl::visualization::CloudViewer viewer("My first cloud viewer");
    viewer.showCloud(my_cloud2);
    while (!viewer.wasStopped())
    {
    }
    return (0);
}



The result does not look right; convergence failed (because the overlap between the three point clouds is too small):
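One thing worth trying in this situation (a sketch I have not verified on this data): pcl::Registration::align() accepts an initial guess matrix, so a rough manual estimate of the relative pose between two cameras can be supplied instead of starting from the identity, and the correspondence distance can be loosened. The fragment below continues the stitching program above (icp and Final are the variables from that listing; the 45-degree rotation is a purely hypothetical coarse estimate):

// Coarse initial guess: e.g. the second camera rotated ~45 degrees about the Y axis (hypothetical)
Eigen::Matrix4f guess = Eigen::Matrix4f::Identity();
float theta = 45.0f * 3.14159265f / 180.0f;
guess(0, 0) =  std::cos(theta);  guess(0, 2) = std::sin(theta);
guess(2, 0) = -std::sin(theta);  guess(2, 2) = std::cos(theta);

icp.setMaxCorrespondenceDistance(200.0);  // placeholder: be generous when the overlap is small
icp.align(Final, guess);                  // start the iteration from the coarse estimate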



Registering the clouds in a different order is also incorrect:




Fourth, compilation error resolution

The code is very simple, but mixing several open-source libraries together is painful; sure enough, I ran into a conflict between OpenCV's FLANN module and the FLANN module used by PCL!

Here is how I tracked it down...

The symptom:


A troublesome struct shows up in the error output!! What now?

I imported all of the Point Cloud Library source code into Source Insight, then imported all of the third-party FLANN header files; browsing the associations turned up this troublesome struct.


Then I looked at who references it: flann.hpp.


It is located in PCL's third-party directory.


Up to this point I saw nothing wrong, because the "learning tutorial" never includes the flann.hpp header. I wanted to try including it to see whether that would solve the problem, and later VS2012's IntelliSense helped me a lot!


Since this code does not use the OpenCV third-party library at all, I decisively removed the opencv2 headers from the project.



Everything works!



Copyright notice: this is the blogger's original article and may not be reproduced without the blogger's permission.

