PCL Point Cloud Registration (1)

In the fields of reverse engineering, computer vision, and digitization of cultural relics, the point clouds obtained are often incomplete and misaligned by rotation and translation, so the local point clouds must be registered in order to obtain a complete data model of the measured object. Registration means choosing a suitable coordinate system and merging the point sets acquired from different viewpoints into that unified coordinate system to form one complete point cloud, which can then be conveniently visualized.

Point cloud registration can be manual, which depends on the measuring instrument, or automatic. Automatic registration uses an algorithm or statistical rule to compute the misalignment between two point clouds, so that the two clouds are registered automatically; in essence, point clouds measured in different coordinate systems are transformed into a common coordinate system to obtain the whole data model. The key problem is how to obtain the coordinate transformation parameters R (rotation matrix) and T (translation vector) such that, after the transformation, the distance between the 3D data measured from the two viewpoints is minimal. Current registration algorithms can be divided into global registration and local registration according to their process. PCL has a dedicated registration module that implements the basic data structures for registration as well as classical registration algorithms such as ICP.

Implementation of registration algorithms and related concepts in PCL

Introduction to pairwise registration: the problem of registering a pair of point cloud datasets is called pairwise registration (pair-wise registration). It is usually solved by estimating a 4x4 rigid transformation matrix that represents the rotation and translation, and applying it to one point cloud dataset (the source) so that it aligns as accurately as possible with the other point cloud dataset (the target).
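Such an estimate is usually expressed as a single 4x4 homogeneous transformation matrix combining the rotation matrix R and the translation vector t (a standard convention, stated here for clarity rather than quoted from the original text):

T = \begin{bmatrix} R_{3\times 3} & t_{3\times 1} \\ 0_{1\times 3} & 1 \end{bmatrix}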

The specific implementation steps are:

(1) Extract key points from the two datasets according to the same keypoint-selection criterion.

(2) Compute feature descriptors for all of the selected key points (a sketch of the descriptor computation follows this list).

(3) Estimate a set of correspondences from the coordinate positions of the feature descriptors in the two datasets, based on the similarity of the features and of their positions, and obtain a preliminary set of corresponding point pairs.

(4) Because the data is assumed to be noisy, remove the erroneous correspondences that would harm the registration.

(5) Use the remaining correct correspondences to estimate the rigid body transformation and complete the registration.
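As an illustration of step (2), the sketch below estimates surface normals and FPFH descriptors for one cloud. It is not from the original text: the descriptor choice (FPFH), the search radii, and the input file name scan.pcd are assumptions, and for brevity the descriptors are computed at every point instead of only at extracted key points.

#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/features/normal_3d.h>
#include <pcl/features/fpfh.h>
#include <pcl/search/kdtree.h>

int main ()
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::io::loadPCDFile ("scan.pcd", *cloud);               // hypothetical input file

  pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ>);

  // Surface normals are needed as input to most local descriptors
  pcl::NormalEstimation<pcl::PointXYZ, pcl::Normal> ne;
  pcl::PointCloud<pcl::Normal>::Ptr normals (new pcl::PointCloud<pcl::Normal>);
  ne.setInputCloud (cloud);
  ne.setSearchMethod (tree);
  ne.setRadiusSearch (0.03);                               // radius depends on the data scale
  ne.compute (*normals);

  // FPFH descriptors (in practice they would be computed only at the key points)
  pcl::FPFHEstimation<pcl::PointXYZ, pcl::Normal, pcl::FPFHSignature33> fpfh;
  pcl::PointCloud<pcl::FPFHSignature33>::Ptr descriptors (new pcl::PointCloud<pcl::FPFHSignature33>);
  fpfh.setInputCloud (cloud);
  fpfh.setInputNormals (normals);
  fpfh.setSearchMethod (tree);
  fpfh.setRadiusSearch (0.05);                             // must be larger than the normal radius
  fpfh.compute (*descriptors);
  return 0;
}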

(2) Correspondence estimation: suppose we already have the two sets of feature vectors computed from two scans of point cloud data. Based on these features we have to find similar features in the two datasets and, from them, determine the overlapping part of the data; only then can registration proceed. Depending on the feature type, PCL uses different methods to search for correspondences between the features.

When point matching is used, the XYZ coordinates of the points themselves act as the feature values, and different search strategies are used for ordered and unordered point cloud data (a kd-tree based correspondence search is sketched after the list):

(1) Exhaustive matching (brute force)

(2) kd-tree nearest neighbor search (FLANN)

(3) Searching in the image space of ordered (organized) point cloud data

(4) Searching in the index space of unordered (unorganized) point cloud data
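A minimal sketch of correspondence estimation with pcl::registration::CorrespondenceEstimation, which searches the target cloud with a kd-tree, is shown below. It is illustrative only; the helper name estimateCorrespondences is an assumption, and older PCL releases name the source setter setInputCloud instead of setInputSource.

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/correspondence.h>
#include <pcl/registration/correspondence_estimation.h>

void estimateCorrespondences (const pcl::PointCloud<pcl::PointXYZ>::Ptr &source,
                              const pcl::PointCloud<pcl::PointXYZ>::Ptr &target,
                              pcl::Correspondences &correspondences)
{
  pcl::registration::CorrespondenceEstimation<pcl::PointXYZ, pcl::PointXYZ> est;
  est.setInputSource (source);          // cloud to be aligned
  est.setInputTarget (target);          // reference cloud
  // For every source point, find the closest target point
  est.determineCorrespondences (correspondences);
}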

(3) Correspondence rejection (removal of incorrect correspondences)

Because of noise, not all of the estimated correspondences are correct. Incorrect correspondences have a negative effect on the estimation of the final rigid body transformation matrix, so they must be removed, for example by random sample consensus (RANSAC) estimation or by other methods. In the end only a certain proportion of the correspondences is kept, which improves the estimation of the transformation matrix.
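A sketch of RANSAC-based rejection with pcl::registration::CorrespondenceRejectorSampleConsensus follows; the inlier threshold and iteration count are assumed values that depend on the scale of the data, and the helper name rejectCorrespondences is illustrative.

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/correspondence.h>
#include <pcl/registration/correspondence_rejection_sample_consensus.h>

void rejectCorrespondences (const pcl::PointCloud<pcl::PointXYZ>::Ptr &source,
                            const pcl::PointCloud<pcl::PointXYZ>::Ptr &target,
                            const pcl::Correspondences &all_correspondences,
                            pcl::Correspondences &inlier_correspondences)
{
  pcl::registration::CorrespondenceRejectorSampleConsensus<pcl::PointXYZ> rejector;
  rejector.setInputSource (source);
  rejector.setInputTarget (target);
  rejector.setInlierThreshold (0.05);       // maximum distance (in the cloud's units) for an inlier
  rejector.setMaximumIterations (1000);     // RANSAC iterations
  rejector.getRemainingCorrespondences (all_correspondences, inlier_correspondences);
}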

(4) Estimation of the transformation matrix (transformation estimation)

The steps for estimating the transformation matrix are as follows (an SVD-based estimation is sketched after the list):

1. Evaluate an error metric based on the correspondences.

2. Estimate a rigid body transformation between the camera poses (motion estimation) that minimizes the error metric.

3. Optimize the structure of the points.

4. Use the rigid body transformation to rotate/translate the source into the same coordinate system as the target, and run an internal ICP loop using all points, a subset of the points, or only the key points.

5. Iterate until the convergence criterion is met.
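One of the estimators PCL provides for this step is the SVD-based solver; a minimal sketch, with the helper name estimateTransform assumed, is:

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/correspondence.h>
#include <pcl/registration/transformation_estimation_svd.h>

Eigen::Matrix4f estimateTransform (const pcl::PointCloud<pcl::PointXYZ> &source,
                                   const pcl::PointCloud<pcl::PointXYZ> &target,
                                   const pcl::Correspondences &correspondences)
{
  pcl::registration::TransformationEstimationSVD<pcl::PointXYZ, pcl::PointXYZ> svd;
  Eigen::Matrix4f transform = Eigen::Matrix4f::Identity ();
  // Computes the 4x4 matrix containing the rotation R and translation T
  svd.estimateRigidTransformation (source, target, correspondences, transform);
  return transform;
}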

(5) Iterative Closest Point algorithm (ICP)

To stitch two point clouds together, the ICP algorithm first establishes the corresponding point sets P and Q according to certain criteria, then iteratively computes the optimal coordinate transformation by least squares, i.e. the rotation matrix R and the translation vector T, so that the error function is minimized. The ICP processing flow consists of four main steps:

1. Sample the raw point cloud data.

2. Determine the initial set of corresponding point pairs.

3. Remove incorrect corresponding point pairs.

4. Solve for the coordinate transformation.
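The error function minimized in step 4 is usually the mean squared distance between corresponding points after the candidate transformation has been applied; with n corresponding pairs p_i in P and q_i in Q, a standard formulation (not quoted from the original text) is

E(R, T) = \frac{1}{n} \sum_{i=1}^{n} \left\| q_i - (R\,p_i + T) \right\|^2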

Introduction to the related PCL classes

Class pcl::CorrespondenceGrouping< PointModelT, PointSceneT >
Abstract base class for correspondence grouping algorithms.
Class pcl::GeometricConsistencyGrouping< PointModelT, PointSceneT >
Class implementing a 3D correspondence grouping that enforces geometric consistency among feature correspondences.
Class pcl::recognition::HoughSpace3D
HoughSpace3D is a 3D voting space.
Class pcl::Hough3DGrouping< PointModelT, PointSceneT, PointModelRfT, PointSceneRfT >
Class implementing a 3D correspondence grouping algorithm that can deal with multiple instances of a model template found in a given scene.
Class pcl::CRHAlignment< PointT, nbins_ >
CRHAlignment uses Camera Roll Histograms (CRH) to find the roll rotation that aligns both views.
Class pcl::recognition::ObjRecRANSAC::Output
This is an output item of the ObjRecRANSAC::recognize () method.
Class pcl::recognition::ObjRecRANSAC::OrientedPointPair
Class pcl::recognition::ObjRecRANSAC::HypothesisCreator
Class pcl::recognition::ObjRecRANSAC
This is a RANSAC-based 3D object recognition method.
Class pcl::recognition::ORROctree::Node::Data
Class pcl::recognition::ORROctree::Node
Class pcl::recognition::ORROctree
This is a very specialized and simple octree class.
Class pcl::recognition::RotationSpace
This is a class for a discrete representation of the rotation space based on the axis-angle representation.
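To show how one of these classes is typically used (this sketch is not from the original text), the fragment below groups model-to-scene feature correspondences with pcl::GeometricConsistencyGrouping. The keypoint clouds, the correspondence set, and the numeric parameters are assumed to come from earlier processing steps.

#include <pcl/point_types.h>
#include <pcl/correspondence.h>
#include <pcl/recognition/cg/geometric_consistency.h>

// Group model-to-scene feature correspondences into object instances.
void groupCorrespondences (const pcl::PointCloud<pcl::PointXYZ>::Ptr &model_keypoints,
                           const pcl::PointCloud<pcl::PointXYZ>::Ptr &scene_keypoints,
                           const pcl::CorrespondencesPtr &model_scene_corrs)
{
  std::vector<Eigen::Matrix4f, Eigen::aligned_allocator<Eigen::Matrix4f> > rototranslations;
  std::vector<pcl::Correspondences> clustered_corrs;

  pcl::GeometricConsistencyGrouping<pcl::PointXYZ, pcl::PointXYZ> gc;
  gc.setGCSize (0.01);                       // consensus set resolution (assumed value)
  gc.setGCThreshold (5);                     // minimum cluster size (assumed value)
  gc.setInputCloud (model_keypoints);
  gc.setSceneCloud (scene_keypoints);
  gc.setModelSceneCorrespondences (model_scene_corrs);
  gc.recognize (rototranslations, clustered_corrs);  // one transform per detected instance
}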

Example Analysis:

(1) How to use the iterative closest point algorithm: the program below uses the ICP iterative closest point algorithm. It randomly generates a point cloud to act as the source, translates it along the x-axis to obtain the target point cloud, and then uses ICP to estimate the rigid body transformation from the source to the target, printing all of the intermediate information.

New file: iterative_closest_point.cpp

#include <iostream>                     // standard input/output
#include <pcl/io/pcd_io.h>              // PCD file I/O
#include <pcl/point_types.h>            // point type definitions
#include <pcl/registration/icp.h>       // ICP registration class

int main (int argc, char** argv)
{
  // Create two pcl::PointCloud<pcl::PointXYZ> shared pointers and initialize them
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_in (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_out (new pcl::PointCloud<pcl::PointXYZ>);

  // Fill the source cloud with random data
  cloud_in->width    = 5;               // set the cloud width
  cloud_in->height   = 1;               // height 1 -> unorganized (unordered) cloud
  cloud_in->is_dense = false;
  cloud_in->points.resize (cloud_in->width * cloud_in->height);
  for (size_t i = 0; i < cloud_in->points.size (); ++i)
  {
    cloud_in->points[i].x = 1024 * rand () / (RAND_MAX + 1.0f);
    cloud_in->points[i].y = 1024 * rand () / (RAND_MAX + 1.0f);
    cloud_in->points[i].z = 1024 * rand () / (RAND_MAX + 1.0f);
  }
  std::cout << "Saved " << cloud_in->points.size () << " data points to input:" << std::endl;  // total number of points
  for (size_t i = 0; i < cloud_in->points.size (); ++i)       // print the actual coordinates
    std::cout << "    " << cloud_in->points[i].x << " " << cloud_in->points[i].y << " " << cloud_in->points[i].z << std::endl;
  *cloud_out = *cloud_in;
  std::cout << "size:" << cloud_out->points.size () << std::endl;

  // Perform a simple rigid body transformation on the cloud to construct the target cloud
  for (size_t i = 0; i < cloud_in->points.size (); ++i)
    cloud_out->points[i].x = cloud_in->points[i].x + 0.7f;
  std::cout << "Transformed " << cloud_in->points.size () << " data points:" << std::endl;
  for (size_t i = 0; i < cloud_out->points.size (); ++i)      // print the constructed target cloud
    std::cout << "    " << cloud_out->points[i].x << " " << cloud_out->points[i].y << " " << cloud_out->points[i].z << std::endl;

  pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;  // create an IterativeClosestPoint object
  icp.setInputCloud (cloud_in);         // set cloud_in as the source cloud
  icp.setInputTarget (cloud_out);       // set cloud_out as the target that cloud_in is matched to
  pcl::PointCloud<pcl::PointXYZ> Final; // stores the cloud after the registration transformation
  icp.align (Final);
  // Print the registration information
  std::cout << "has converged:" << icp.hasConverged () << " score: " << icp.getFitnessScore () << std::endl;
  std::cout << icp.getFinalTransformation () << std::endl;
  return (0);
}

Results of compiling and running

From the test results we can see that the transformed point cloud only has a fixed value of 0.7 added to its x coordinate. ICP then computes the rotation and translation between the target cloud and the source cloud, and the x translation of 0.7 is clearly visible in the translation part of the printed transformation matrix.

At the same time, we can modify the program ourselves to observe different experimental results, for example by applying the estimated transformation back to the source cloud, as sketched below.
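The estimated transformation can be verified by applying it to the source cloud. This is not part of the original example; the helper name applyEstimatedTransform is illustrative. Calling it with *cloud_in and icp.getFinalTransformation () should reproduce cloud_out up to the ICP fitness error.

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/common/transforms.h>      // pcl::transformPointCloud

// Apply a 4x4 transformation (e.g. the result of icp.getFinalTransformation ()) to a cloud
pcl::PointCloud<pcl::PointXYZ> applyEstimatedTransform (const pcl::PointCloud<pcl::PointXYZ> &source,
                                                        const Eigen::Matrix4f &transformation)
{
  pcl::PointCloud<pcl::PointXYZ> aligned;
  pcl::transformPointCloud (source, aligned, transformation);
  return aligned;
}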

Using ICP to find the transformation between two captured frames (point clouds):

At first, if the data is obtained directly from the Kinect, the following error occurs, because the ICP algorithm cannot process point cloud data containing NaNs; these points have to be removed before the cloud can be used as input to the ICP algorithm.

Error message:

So we first use the code learned earlier to remove the invalid (NaN) points from the point clouds and then use them as the input. Because this is my own point cloud data and it has not been preprocessed, the result of running ICP on the two input clouds and visualizing it is shown here. A minimal sketch of the NaN-removal step follows.
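The NaN-removal step can be done with pcl::removeNaNFromPointCloud; the snippet below is a minimal sketch, and the function name stripNaNs and the variable names are illustrative.

#include <vector>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/filters/filter.h>         // pcl::removeNaNFromPointCloud

// Remove NaN points from a cloud so that it can be fed to ICP
void stripNaNs (const pcl::PointCloud<pcl::PointXYZ>::Ptr &raw,
                pcl::PointCloud<pcl::PointXYZ>::Ptr &clean)
{
  std::vector<int> indices;             // indices of the points that were kept
  pcl::removeNaNFromPointCloud (*raw, *clean, indices);
}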

(2) How to incrementally register multiple point clouds

This example uses the iterative closest point algorithm to incrementally register a series of point clouds, two at a time. The idea is to transform all of the clouds so that they all lie in the same coordinate system as the first cloud: the best transformation between each pair of consecutive, overlapping clouds is found, and these transformations are accumulated across the whole set. Any point cloud fed to the ICP algorithm needs a rough pre-alignment (for example from robot odometry or from a map frame), and each cloud must overlap with the next one. The accumulation idea is sketched below, before the full source file.
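A compact sketch of the accumulation step only (this is not the tutorial's code): plain IterativeClosestPoint stands in for the tutorial's more elaborate pairwise alignment, and the helper name alignAll is illustrative. The point is how each pairwise transform is folded into a global transform that maps every cloud into the frame of the first one.

#include <vector>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/common/transforms.h>
#include <pcl/registration/icp.h>

typedef pcl::PointCloud<pcl::PointXYZ> Cloud;

// Register every cloud against its predecessor and accumulate the pairwise transforms
void alignAll (const std::vector<Cloud::Ptr> &clouds, std::vector<Cloud::Ptr> &aligned)
{
  if (clouds.empty ())
    return;
  Eigen::Matrix4f global_transform = Eigen::Matrix4f::Identity ();
  aligned.push_back (clouds[0]);                      // the first cloud defines the global frame
  for (size_t i = 1; i < clouds.size (); ++i)
  {
    // Pairwise step (plain ICP; the tutorial uses a non-linear ICP with curvature-weighted points)
    pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
    icp.setInputSource (clouds[i]);                   // current cloud, in its own frame
    icp.setInputTarget (clouds[i - 1]);               // previous cloud, in its own frame
    Cloud unused;
    icp.align (unused);
    Eigen::Matrix4f pair_transform = icp.getFinalTransformation ();  // maps cloud i into frame i-1

    // global_transform maps frame i-1 into the frame of the first cloud;
    // composing it with pair_transform maps frame i into the first cloud's frame
    global_transform = global_transform * pair_transform;

    Cloud::Ptr in_first_frame (new Cloud);
    pcl::transformPointCloud (*clouds[i], *in_first_frame, global_transform);
    aligned.push_back (in_first_frame);
  }
}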

New file: pairwise_incremental_registration.cpp

#include <boost/make_shared.hpp>                  // boost shared pointers
#include <pcl/point_types.h>                      // point type definitions
#include <pcl/point_cloud.h>                      // point cloud class definition
#include <pcl/point_representation.h>             // point representation
#include <pcl/io/pcd_io.h>                        // PCD file I/O
#include <pcl/filters/voxel_grid.h>               // voxel grid downsampling filter
#include <pcl/filters/filter.h>                   // filter-related header
#include <pcl/features/normal_3d.h>               // normal feature estimation
#include <pcl/registration/icp.h>                 // ICP class
#include <pcl/registration/icp_nl.h>              // non-linear ICP class
#include <pcl/registration/transforms.h>          // transformation matrix helpers
#include <pcl/visualization/pcl_visualizer.h>     // visualization class

using pcl::visualization::PointCloudColorHandlerGenericField;
using pcl::visualization::PointCloudColorHandlerCustom;

// Convenient typedefs
typedef pcl::PointXYZ PointT;
typedef pcl::PointCloud<PointT> PointCloud;
typedef pcl::PointNormal PointNormalT;
typedef pcl::PointCloud<PointNormalT> PointCloudWithNormals;

// Global visualizer: the left and right viewports show the clouds before and after registration
pcl::visualization::PCLVisualizer *p;
int vp_1, vp_2;   // IDs of the left and right viewports

// Convenient structure that keeps the file name together with the point cloud object,
// so that several point cloud files can be handled at the same time
struct PCD
{
  PointCloud::Ptr cloud;   // point cloud shared pointer
  std::string f_name;      // file name
  PCD () : cloud (new PointCloud) {};
};

struct PCDComparator   // compare PCDs by file name
{
  bool operator () (const PCD& p1, const PCD& p2) { return (p1.f_name < p2.f_name); }
};

// Define a new point representation of the form <x, y, z, curvature>
class MyPointRepresentation : public pcl::PointRepresentation<PointNormalT>
{
  using pcl::PointRepresentation<PointNormalT>::nr_dimensions_;
public:
  MyPointRepresentation () { nr_dimensions_ = 4; }   // define the dimension of a point
  // Override copyToFloatArray to convert a point into a four-dimensional array
  virtual void copyToFloatArray (const PointNormalT &p, float *out) const
  {
    // <x, y, z, curvature>
    out[0] = p.x;
    out[1] = p.y;
    out[2] = p.z;
    out[3] = p.curvature;
  }
};

/** \brief Left view: display the source and target clouds that are not yet aligned */
void showCloudsLeft (const PointCloud::Ptr cloud_target, const PointCloud::Ptr cloud_source)
{
  p->removePointCloud ("vp1_target");
  p->removePointCloud ("vp1_source");
  PointCloudColorHandlerCustom<PointT> tgt_h (cloud_target, 0, 255, 0);
  PointCloudColorHandlerCustom<PointT> src_h (cloud_source, 255, 0, 0);
  p->addPointCloud (cloud_target, tgt_h, "vp1_target", vp_1);
  p->addPointCloud (cloud_source, src_h, "vp1_source", vp_1);
  PCL_INFO ("Press q to begin the registration.\n");
  p->spin ();
}

/** \brief Right view: display the source and target clouds */
void showCloudsRight (const PointCloudWithNormals::Ptr cloud_target, const PointCloudWithNormals::Ptr cloud_source)
{
  p->removePointCloud ("source");
  p->removePointCloud ("target");
  PointCloudColorHandlerGenericField<PointNormalT> tgt_color_handler (cloud_target, "curvature");
  if (!tgt_color_handler.isCapable ())
    PCL_WARN ("Cannot create curvature color handler!");
  PointCloudColorHandlerGenericField<PointNormalT> src_color_handler (cloud_source, "curvature");
  if (!src_color_handler.isCapable ())
    PCL_WARN ("Cannot create curvature color handler!");
  p->addPointCloud (cloud_target, tgt_color_handler, "target", vp_2);
  p->addPointCloud (cloud_source, src_color_handler, "source", vp_2);
  p->spinOnce ();
}

/** \brief Load a set of PCD files that we want to register together
  * \param argc the number of arguments (pass from main ())
  * \param argv the actual command line arguments (pass from main ())
  * \param models the resultant vector of point cloud datasets
  */
void loadData (int argc, char **argv, std::vector<PCD, Eigen::aligned_allocator<PCD> > &models)
{
  std::string extension (".pcd");
  // The first argument is the command itself, so start with the second argument
  for (int i = 1; i < argc; i++)
  {
    std::string fname = std::string (argv[i]);
    // A PCD file name needs at least 5 characters (the ".pcd" suffix alone takes four)
    if (fname.size () <= extension.size ()) continue;
    std::transform (fname.begin (), fname.end (), fname.begin (), (int (*)(int))tolower);
    // Check whether the argument is a file with the .pcd suffix
    if (fname.compare (fname.size () - extension.size (), extension.size (), extension) == 0)
    {
