This article is from: http://cn.mathworks.com/help/vision/examples/3-d-point-cloud-registration-and-stitching.html
3D point cloud registration and stitching plays an important role in point cloud processing; for example, a series of point clouds captured with a Kinect can be stitched together to build a 3D scene covering a larger area.
This article applies the ICP (Iterative Closest Point) algorithm to the registration between point clouds, which can be used to build 3D models of objects and 3D maps of the real world; in academic terms this is known as SLAM (simultaneous localization and mapping).
3D Point Cloud Registration
The figure above illustrates the ICP point cloud registration process; the corresponding code is as follows:
dataDir = fullfile(toolboxdir('vision'), 'visiondata', 'livingRoom.mat');
load(dataDir);
% Extract two consecutive point clouds and use the first point cloud as
% reference.
ptCloudRef = livingRoomData{1};
ptCloudCurrent = livingRoomData{2};
gridSize = 0.1;
fixed = pcdownsample(ptCloudRef, 'gridAverage', gridSize);
moving = pcdownsample(ptCloudCurrent, 'gridAverage', gridSize);
% Note that the downsampling step does not only speed up the registration,
% but can also improve the accuracy.
First, read the reference (target) point cloud and the current (moving) point cloud, corresponding to ptCloudRef and ptCloudCurrent in the code above. Optionally remove outlier points (this step is slower), then downsample. The code above uses the grid-average sampling method; MATLAB's point cloud tools also provide a 'random' method, which is recommended here. The idea of grid sampling is to partition the point cloud into cubic grid cells, with each cell outputting a single 3D point located at the average of the points inside that cell.
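The grid-average idea described above can be sketched as follows. This is an illustrative NumPy version, not MATLAB's implementation; it assumes a point cloud stored as an (N, 3) array, analogous in behavior to pcdownsample with the 'gridAverage' option.

```python
import numpy as np

def grid_average_downsample(points, grid_size):
    """Partition space into cubes of side grid_size; output one averaged
    point per occupied cube, like grid-average downsampling."""
    # Integer cell index for every point.
    idx = np.floor(points / grid_size).astype(np.int64)
    # Group points by cell; 'inverse' maps each point to its cell's row.
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    n_cells = inverse.max() + 1
    sums = np.zeros((n_cells, 3))
    counts = np.zeros(n_cells)
    np.add.at(sums, inverse, points)   # accumulate points per cell
    np.add.at(counts, inverse, 1)      # count points per cell
    return sums / counts[:, None]      # average position per cell
```

Because each occupied cell contributes exactly one output point, the result is both smaller and more evenly distributed than the input, which is why downsampling can also improve registration accuracy.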
tform = pcregrigid(moving, fixed, 'Metric', 'pointToPlane', 'Extrapolate', true);
ptCloudAligned = pctransform(ptCloudCurrent, tform);
After the above processing, the ICP algorithm computes the rigid transformation between the reference point cloud and the current point cloud, and the current point cloud is then transformed into the coordinate system of the reference point cloud.
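The core iterate-match-align loop of ICP can be sketched in a few lines. The article uses MATLAB's pcregrigid with a point-to-plane metric; the simpler point-to-point variant below illustrates the same principle in NumPy, and all names in it are illustrative.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (the Kabsch/SVD method used inside each ICP iteration)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:     # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(moving, fixed, iters=30):
    """Align 'moving' to 'fixed'; returns the transformed copy of 'moving'."""
    src = moving.copy()
    for _ in range(iters):
        # Brute-force nearest neighbour in 'fixed' for every source point.
        d = np.linalg.norm(src[:, None, :] - fixed[None, :, :], axis=2)
        matches = fixed[d.argmin(axis=1)]
        # Best rigid fit to the current matches, then re-match and repeat.
        R, t = best_rigid_transform(src, matches)
        src = src @ R.T + t
    return src
```

Real implementations replace the brute-force matching with a k-d tree and use point-to-plane residuals for faster convergence, but the alternation of matching and rigid fitting is the same.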
mergeSize = 0.015;
ptCloudScene = pcmerge(ptCloudRef, ptCloudAligned, mergeSize);

% Visualize the input images.
figure
subplot(2,2,1);
imshow(ptCloudRef.Color);
title('First input image');
drawnow;
subplot(2,2,3);
imshow(ptCloudCurrent.Color);
title('Second input image');
drawnow;

% Visualize the world scene.
subplot(2,2,[2,4]);
showPointCloud(ptCloudScene, 'VerticalAxis', 'Y', 'VerticalAxisDir', 'Down');
title('Initial world scene');
xlabel('X (m)');
ylabel('Y (m)');
zlabel('Z (m)');
drawnow;
When the two point clouds are merged, the overlapping region is likewise filtered with a box grid, so duplicated points are averaged away.
The image above shows the result of stitching the two real-time indoor 3D point clouds together.
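The merge step can be sketched in the same spirit: concatenate the two clouds and box-grid-filter the result so overlapping points collapse into one. This is an illustrative NumPy analogue of pcmerge, not its actual implementation.

```python
import numpy as np

def merge_clouds(a, b, merge_size):
    """Concatenate two (N, 3) clouds, then average all points that fall
    into the same box of side merge_size (box grid filter)."""
    points = np.vstack([a, b])
    idx = np.floor(points / merge_size).astype(np.int64)
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    out = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(len(out))
    np.add.at(out, inverse, points)
    np.add.at(counts, inverse, 1)
    return out / counts[:, None]
```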
3D Point Cloud Stitching
To build a larger 3D scene, repeat the process above over a series of point clouds. Use the first point cloud to establish the reference coordinate system, and transform every other point cloud into that coordinate system.
% Store the transformation object that accumulates the transformation.
accumTform = tform;

figure
hAxes = showPointCloud(ptCloudScene, 'VerticalAxis', 'Y', 'VerticalAxisDir', 'Down');
title('Updated world scene');
% Set the axes property for faster rendering.
hAxes.CameraViewAngleMode = 'auto';
hScatter = hAxes.Children;

for i = 3:length(livingRoomData)
    ptCloudCurrent = livingRoomData{i};

    % Use previous moving point cloud as reference.
    fixed = moving;
    moving = pcdownsample(ptCloudCurrent, 'gridAverage', gridSize);

    % Apply ICP registration.
    tform = pcregrigid(moving, fixed, 'Metric', 'pointToPlane', 'Extrapolate', true);

    % Transform the current point cloud to the reference coordinate system
    % defined by the first point cloud.
    accumTform = affine3d(tform.T * accumTform.T);
    ptCloudAligned = pctransform(ptCloudCurrent, accumTform);

    % Update the world scene.
    ptCloudScene = pcmerge(ptCloudScene, ptCloudAligned, mergeSize);

    % Visualize the world scene.
    hScatter.XData = ptCloudScene.Location(:, 1);
    hScatter.YData = ptCloudScene.Location(:, 2);
    hScatter.ZData = ptCloudScene.Location(:, 3);
    hScatter.CData = ptCloudScene.Color;
    drawnow('update');
end

% During the recording, the Kinect was pointing downward. To visualize the
% result more easily, let's transform the data so that the ground plane is
% parallel to the X-Z plane.
angle = -pi/10;
A = [1, 0, 0, 0; ...
     0, cos(angle), sin(angle), 0; ...
     0, -sin(angle), cos(angle), 0; ...
     0, 0, 0, 1];
ptCloudScene = pctransform(ptCloudScene, affine3d(A));
showPointCloud(ptCloudScene, 'VerticalAxis', 'Y', 'VerticalAxisDir', 'Down', ...
    'Parent', hAxes);
title('Updated world scene');
xlabel('X (m)');
ylabel('Y (m)');
zlabel('Z (m)');
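The key step in the loop is composing each new pairwise ICP result with the accumulated transform. MATLAB's affine3d stores the transpose (row-vector) convention, which is why the loop writes tform.T * accumTform.T; under the more common column-vector convention the order reverses, as in this illustrative NumPy sketch (all names here are assumptions, not MATLAB APIs).

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous matrix from rotation R (3x3), translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply_transform(T, points):
    """Apply T to an (N, 3) array of points (column-vector convention)."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (homo @ T.T)[:, :3]

# Frame 3 -> frame 2, then frame 2 -> frame 1: compose right-to-left so the
# accumulated matrix maps frame-3 points directly into the frame-1 system.
T21 = make_transform(np.eye(3), [1.0, 0.0, 0.0])   # translate +x
T32 = make_transform(np.eye(3), [0.0, 2.0, 0.0])   # translate +y
accum = T21 @ T32
```

The same right-to-left composition, written in MATLAB's transposed convention, becomes left-to-right: (T21 * T32) transposed equals T32' * T21', matching the tform.T * accumTform.T pattern in the loop above.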
The transformation matrix from each registration can also be used as the initial estimate for the next registration, which accelerates the process.