A simple calibration method for a rotating platform and its implementation in MATLAB
Abstract: Platform calibration is one of the basic measures to ensure the precision of a rotating platform. To calibrate the center point of the rotating platform, the vision system must use a single coordinate system for positioning, that is, the lens coordinate system and the motion coordinate system of the vision system must be linked together; this linking is what the calibration accomplishes.
Keywords: rotation center calibration; center position detection; MATLAB implementation;
Background: In modern industrial automation, machine vision is applied more and more widely. Robots and automated work systems used for assembly, welding and industrial inspection (such as quality inspection) achieve higher operating accuracy when equipped with an automatic calibration system for the rotating platform. This paper explores a method for calibrating the rotation center of the rotating platform of a mobile-phone automatic film-laminating machine.
1 Implementation method
When calibrating the rotation center of a rotating platform, the image itself is affected by environmental or equipment factors and sometimes contains obvious noise. It is therefore necessary to preprocess the camera image, extract the feature edge of the image, locate the edge coordinates, and then fit a straight line to the edge coordinates. This yields the line equation of the mobile-phone edge in the rotation plane. From three images taken by the camera while the platform rotates, three different straight lines are fitted; the distances from the rotation center of the platform to these lines are equal.
Flow chart: (figure omitted)
2 Implementation process
2.1 Image preprocessing
Image preprocessing is a series of operations that make the image meet the requirements of the centroid-coordinate module, i.e. the largest edge of the image is preserved. In the rotating-platform calibration process, preprocessing mainly includes: ① grayscale conversion. Each pixel of a color image has R, G and B components, and each component can take 256 values (0–255), so one pixel can vary over more than 16 million (256×256×256) colors. A grayscale image is a special image in which the R, G and B components are equal, so one pixel varies over only 256 levels. Converting the image to grayscale therefore reduces the subsequent computation in digital image processing, while the grayscale image still reflects the overall and local chrominance and luminance distribution and characteristics of the whole image, just as the color image does. ② Denoising. Because the image noise mainly comes from the rotating platform, i.e. noise caused by the image background, a Gaussian smoothing filter is used to smooth the image and remove the noise.
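As a rough illustration (not the exact settings of the original setup), the two preprocessing steps can be written with MATLAB Image Processing Toolbox functions; the input file name is only a placeholder, and the filter parameters are those used in the appendix code:

```matlab
% Minimal preprocessing sketch; 'phone.jpg' is an illustrative placeholder
G = imread('phone.jpg');           % load the captured color image
B = rgb2gray(G);                   % (1) grayscale conversion
h = fspecial('gaussian', 12, 15);  % (2) Gaussian smoothing kernel
B = imfilter(B, h);                % suppress background noise
```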
2.2 Image Edge extraction
The biggest advantage of edge extraction is its ability to highlight the edge region while dimming the irrelevant parts of the image. Based on the characteristics of the preprocessed image, the Canny operator is selected to extract the image edge. The Canny operator detects edges by finding local maxima of the image gradient, which is computed using the derivative of a Gaussian filter. The method uses two thresholds to detect both strong and weak edges, and a weak edge is included in the output only if it is connected to a strong edge. This method is not easily disturbed by noise and can detect true weak edges [1].
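For illustration, the corresponding MATLAB call is sketched below; the explicit threshold vector is only an example of how the two thresholds mentioned above can be supplied:

```matlab
% Canny edge extraction on the preprocessed grayscale image B
bw = edge(B, 'canny');                 % automatic high/low thresholds
% bw = edge(B, 'canny', [0.05 0.15]);  % example of explicit [low high] thresholds
```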
2.3 Removing small-area objects
In the extracted edge image there are also small objects such as mobile-phone buttons and cooling holes, which cannot be removed by operations such as filtering. A small-area threshold must therefore be set to delete these smaller regions, leaving only the phone edge with the largest area for the next operation.
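A sketch of this step is shown below; the area threshold of 900 pixels is the value used later in the appendix code and may need tuning for other images:

```matlab
% Remove connected regions smaller than 900 pixels (8-connectivity),
% keeping only the large phone-edge contour
C = bwareaopen(bw, 900, 8);
```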
2.4 Determining centroid coordinates
After the image edge is extracted, the system must quickly and accurately determine the centroid coordinates of the image. In a binary image the centroid coordinates (K, L) are given by

K = (Σ_{i=1..m} Σ_{j=1..n} i·A(i,j)) / (Σ_{i=1..m} Σ_{j=1..n} A(i,j)),
L = (Σ_{i=1..m} Σ_{j=1..n} j·A(i,j)) / (Σ_{i=1..m} Σ_{j=1..n} A(i,j))    (1)

where A(i,j) = 1 if (i,j) belongs to the target and A(i,j) = 0 if (i,j) belongs to the background, (i,j) are the coordinates of a pixel in the image, and m, n are the numbers of rows and columns of the image. When the target and the vision system are in relative motion, repeatedly applying formula (1) achieves centroid search and tracking of the target [2].
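Formula (1) can be evaluated in MATLAB either with explicit loops (as in the appendix) or in the vectorized form sketched below, where A is the binary edge image with target pixels equal to 1:

```matlab
% Vectorized evaluation of formula (1); A is the binary edge image (target = 1)
[m, n] = size(A);
[J, I] = meshgrid(1:n, 1:m);          % column (j) and row (i) index grids
K = sum(I(:).*A(:)) / sum(A(:));      % centroid coordinate along i
L = sum(J(:).*A(:)) / sum(A(:));      % centroid coordinate along j
```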
2.5 Determining the extracted image edge
According to the located centroid coordinates (K, L), the region containing the image edge is determined so that the edge line equation can be calculated. A small segment of the image edge is extracted, K−Δ ≤ x ≤ K+Δ, 1 ≤ y ≤ L/2, where Δ can be taken as K/10 to adapt to different images; this part of the image is then extracted.
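A sketch of this extraction is given below; which image index plays the role of x and which of y depends on how the image is stored, so the indexing here is only indicative:

```matlab
% Extract a small window around the edge; delta = K/10 as suggested above
delta = fix(K/10);
D = C(1:fix(L/2), K-delta:K+delta);   % rows 1..L/2, columns K-delta..K+delta (assumed orientation)
```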
2.6 Linear Fit
The coordinates of the extracted image edge are fitted according to the least-squares criterion: minimize the sum of squared residuals of all sample observations, that is, min Q = min Σᵢ eᵢ², where eᵢ = yᵢ − ŷᵢ, to determine the estimates a0, a1, a2, ..., an of the unknown parameters. A confidence interval C is also given: the smaller the confidence interval, the higher the reliability of taking the sample mean as a fitting point when fitting the curve, and vice versa [3]. Since the edge of the mobile phone in the image is, as an objective fact, a straight line, a smaller interval should be chosen for the confidence interval C.
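In MATLAB the straight-line fit itself is a one-line least-squares call; here x and y are assumed to hold the coordinates of the extracted edge pixels:

```matlab
% First-degree least-squares fit of the edge pixel coordinates
p = polyfit(x, y, 1);      % p(1) = slope, p(2) = intercept of y = p(1)*x + p(2)
```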
2.7 Determining the center of rotation coordinates
In this paper, the rotation center of the platform is determined from the three straight lines obtained after rotation and translation. After the linear fit, the three lines are P1: y = A1x + C1; P2: y = A2x + C2; P3: y = A3x + C3. Because the distances from the center M to the three lines are equal, the three lines are tangents to the circle of radius d centered at the rotation center. The lines p1, p2, p3 intersect pairwise in the rotation plane and form a triangle. It is known that in the plane there are four points equidistant from the three sides of a triangle: one is the incenter D (the intersection of the interior angle bisectors), and there are three excenters E, F, G (each the intersection of one interior angle bisector with the bisectors of the exterior angles at the other two vertices).
The point-to-line distance formula can be derived as follows. The distance from a point M(x0, y0) to a line p is the minimum of the distances from M to the points Q(x, y) on p. To make use of the line equation ax + by + c = 0, the coefficients are manipulated:

(a² + b²)[(x − x0)² + (y − y0)²]
= a²(x − x0)² + b²(y − y0)² + a²(y − y0)² + b²(x − x0)²
= [a(x − x0) + b(y − y0)]² + [a(y − y0) − b(x − x0)]²
≥ [a(x − x0) + b(y − y0)]² = (ax0 + by0 + c)²   (since ax + by + c = 0)

so that

√((x − x0)² + (y − y0)²) ≥ |ax0 + by0 + c| / √(a² + b²).

Equality holds if and only if a(y − y0) = b(x − x0), so the minimum value, i.e. the distance, is d = |ax0 + by0 + c| / √(a² + b²).
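The resulting formula translates directly into a small helper function (the name pointLineDist is only illustrative, not part of the original code):

```matlab
% Distance from point (x0, y0) to the line a*x + b*y + c = 0
function d = pointLineDist(a, b, c, x0, y0)
    d = abs(a*x0 + b*y0 + c) / sqrt(a^2 + b^2);
end
```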
Analysis of Figure 1 shows that there are four points D, E, F, G in the plane whose distances to the three lines p1, p2, p3 are equal, i.e. possible rotation centers M of the rotating platform. Let M(x0, y0), and write p1, p2, p3 in the standard form of the line equation: p1: a1x + b1y + c1 = 0; p2: a2x + b2y + c2 = 0; p3: a3x + b3y + c3 = 0. Then

a1x0 + b1y0 + c1 = ±d·√(a1² + b1²);
a2x0 + b2y0 + c2 = ±d·√(a2² + b2²);
a3x0 + b3y0 + c3 = ±d·√(a3² + b3²).

Taking the signed distance values as (d, d, d) to calculate the coordinates of point M1:

a1x0 + b1y0 − d·√(a1² + b1²) = −c1;
a2x0 + b2y0 − d·√(a2² + b2²) = −c2;
a3x0 + b3y0 − d·√(a3² + b3²) = −c3.

This is a system of linear equations in x0, y0 and d, which can be solved in matrix form. If the coefficients of each line are first normalized so that aᵢ² + bᵢ² = 1, the system becomes

[a1 b1 −1; a2 b2 −1; a3 b3 −1] · [x0; y0; d] = [−c1; −c2; −c3].

Solving it gives the first candidate center point M1(x0, y0) and the distance d from M1 to the three straight lines.
Taking the signed distance values as (−d, d, d) and solving the corresponding system gives the coordinates of point M2; taking (d, −d, d) gives point M3; and taking (d, d, −d) gives point M4. In each case only the sign of the corresponding entry in the d column of the coefficient matrix changes (the numerical solutions are omitted here).
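The four candidate centers can also be computed in a single loop over the four sign patterns, which is essentially what the findpoint function in the appendix does; the coefficients aᵢ, bᵢ, cᵢ are assumed to be already normalized so that aᵢ² + bᵢ² = 1:

```matlab
% Solve for all four candidate centers; row k of S is the sign pattern of d
S = [ 1  1  1;   % incenter case (d, d, d)
     -1  1  1;   % (-d, d, d)
      1 -1  1;   % (d, -d, d)
      1  1 -1];  % (d, d, -d)
M = zeros(4, 2);
for k = 1:4
    B = [a1, b1, -S(k,1); a2, b2, -S(k,2); a3, b3, -S(k,3)];
    sol = B \ [-c1; -c2; -c3];   % sol = [x0; y0; d]
    M(k, :) = sol(1:2)';
end
```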
From the above, four candidate coordinate points for the center of the rotating platform can be obtained by this method. In the absence of other constraints, the rotating platform is rotated three times and the corresponding actual rotation angle θ is recorded each time; from any three of the four resulting lines, four candidate coordinate points are calculated. Then, given an appropriate tolerance (confidence interval), the two points among the eight coordinates that coincide, or whose mutual distance is below the tolerance, are matched as M and N, and the midpoint of these two points is taken as the rotation center S of the platform.
3 Experimental results
Figure 2
In Figure 2, a is the original image of a mobile phone placed horizontally; b, c, d and e are the preprocessed images of the phone after the rotating platform has been rotated by 0°, −3°, 3° and 6° respectively. The image edges shown in the figure appear as broken white lines, but when magnified it can be seen that the edges are connected. Results of determining the center position of the rotating platform:
* Selecting b, c, d to determine center coordinates M:
p1 = [-0.0000  56.0001];
p2 = [-0.0000  56.0021];
p3 = [ 0.0000  55.9949];
M1 = (values omitted)
M2 = (values omitted)
M3 = 1.0e+06 * (values omitted)
M4 = 1.0e+06 * (values omitted)
* Selecting e, c, d to determine center coordinates N:
p1 = [ 0.0000  55.9892];
p2 = [-0.0000  56.0021];
p3 = [ 0.0000  55.9949];
N1 = (values omitted)
N2 = (values omitted)
N3 = 1.0e+06 * (values omitted)
N4 = 1.0e+06 * (values omitted)
Analysis of results
The results above are abbreviated as M = 1.0e+06 * (…) and N = 1.0e+06 * (…) (numeric matrices omitted).
Given the tolerance (confidence interval) ε = 0.01, find in M and N a pair of points whose distance is ≤ ε; if such a pair exists, take the rotation center coordinates as x = (xᵢ + xⱼ)/2, y = (yᵢ + yⱼ)/2.
After this processing, the center coordinates of the platform are obtained: Z(302.4708, 111.9997).
Appendix:
① Create a new M-file named findline.m and enter the following code:
function [D] = findline(G)
% findline: preprocess the image, extract the phone edge and return a
% small window D around the edge for the straight-line fit
if size(G, 3) == 3
    B = rgb2gray(G);          % convert a color image to grayscale
else
    B = G;                    % input is already grayscale/binary
end
% h = fspecial('gaussian', 12, 15);
% B = imfilter(B, h);         % if the background is noisy, apply Gaussian smoothing
bw = edge(B, 'canny');        % Canny edge extraction
C = bwareaopen(bw, 900, 8);   % remove small-area objects (threshold 900 pixels)
C = double(C);
[m, n] = size(C);
% find the centroid coordinates
sum1 = 0;
sum2 = 0;
for i = 1:m
    for j = 1:n
        sum1 = sum1 + C(i, j);
        sum2 = sum2 + i*C(i, j);
    end
end
K = fix(sum2/sum1);           % centroid abscissa
sum1 = 0;                     % reset the sums before the second pass
sum2 = 0;
for i = 1:m
    for j = 1:n
        sum1 = sum1 + C(i, j);
        sum2 = sum2 + j*C(i, j);
    end
end
L = fix(sum2/sum1);           % centroid ordinate
A = (L:m);                    % rows of the extracted window
b = (K-100:K+10);             % columns of the extracted window
D = C(A, b);
% subplot(1, 2, 1);
% imshow(C);
% subplot(1, 2, 2);
% imshow(D);
② Create a new M-file named ispoly.m and enter the following code:
function [P, b] = ispoly(A)    % A is the given image
% h = fspecial('gaussian', 12, 15);
% F = imfilter(A, h);
b = findline(A);               % window around the extracted edge
[x, y] = find(b == 0);         % coordinates of the selected pixels
n = 1;                         % fit a first-degree polynomial (a straight line)
P = polyfit(x, y, n)           % coefficients of the fitted line equation
③ Create a new M-file named findpoint.m and enter the following code:
function M = findpoint(a1, c1, b1, a2, c2, b2, a3, c3, b3)
% findpoint: solve for the four candidate rotation centers given three
% lines a_i*x + b_i*y + c_i = 0 (arguments arrive as slope, intercept, -1)
d = sqrt(a1*a1 + b1*b1);       % normalize each line so that a^2 + b^2 = 1
a1 = a1/d;  b1 = b1/d;  c1 = c1/d;
d = sqrt(a2*a2 + b2*b2);
a2 = a2/d;  b2 = b2/d;  c2 = c2/d;
d = sqrt(a3*a3 + b3*b3);
a3 = a3/d;  b3 = b3/d;  c3 = c3/d;
C = [-c1; -c2; -c3];
B1 = [a1, b1, -1; a2, b2, -1; a3, b3, -1];   % distance signs (d, d, d)
m1 = B1\C;
B2 = [a1, b1, 1; a2, b2, -1; a3, b3, -1];    % distance signs (-d, d, d)
m2 = B2\C;
B3 = [a1, b1, -1; a2, b2, 1; a3, b3, -1];    % distance signs (d, -d, d)
m3 = B3\C;
B4 = [a1, b1, -1; a2, b2, -1; a3, b3, 1];    % distance signs (d, d, -d)
m4 = B4\C;
M = [m1(1), m1(2); m2(1), m2(2); m3(1), m3(2); m4(1), m4(2)]
④ Create a new script named DeMar.m and enter the following commands:
G1 = imread('e:\certain\picture\image processing picture\pg1.jpg');
B1 = rgb2gray(G1);
% h = fspecial('gaussian', 12, 15);
% B1 = imfilter(B1, h);
bw1 = edge(B1, 'canny');
C1 = bwareaopen(bw1, 900, 8);
subplot(2, 3, 1);
imshow(G1);
title('a (original)');
subplot(2, 3, 2);
imshow(C1);
title('b (0°)');
C2 = imrotate(C1, -3, 'nearest', 'loose');
subplot(2, 3, 3);
imshow(C2);
title('c (-3°)');
C3 = imrotate(C1, 3, 'nearest', 'loose');
subplot(2, 3, 4);
imshow(C3);
title('d (3°)');
C4 = imrotate(C1, 6, 'nearest', 'loose');
subplot(2, 3, 5);
imshow(C4);
title('e (6°)');
P1 = ispoly(C1);   % fit the edge lines of b, c, d
P2 = ispoly(C2);
P3 = ispoly(C3);
K = [P1, -1, P2, -1, P3, -1];
M = findpoint(K(1,1), K(1,2), K(1,3), K(1,4), K(1,5), K(1,6), K(1,7), K(1,8), K(1,9))   % candidate center points
P1 = ispoly(C4);   % fit the edge lines of e, c, d
P2 = ispoly(C2);
P3 = ispoly(C3);
K = [P1, -1, P2, -1, P3, -1];
N = findpoint(K(1,1), K(1,2), K(1,3), K(1,4), K(1,5), K(1,6), K(1,7), K(1,8), K(1,9))   % candidate center points
Running the DeMar file gives the corresponding candidate point coordinates M and N and their distances to the three fitted straight lines.
⑤ Keeping the workspace data from running DeMar, create a new script named text.m and enter the following commands:
for i = 1:4
    for j = 1:4
        s = sqrt((M(i,1) - N(j,1))^2 + (M(i,2) - N(j,2))^2);   % distance between candidate points
        if s <= 0.01                                           % tolerance (confidence interval)
            mx = (M(i,1) + N(j,1))/2
            my = (M(i,2) + N(j,2))/2
        end
    end
end
Z = [mx, my]
Here s is compared with the given tolerance (confidence interval) of 0.01, which can be adjusted according to the actual situation, thus obtaining the rotation-center coordinate point Z of the selected plane.
References:
[1] Berchun. Application of MATLAB in image edge extraction.
[2] Sonko, et al., Lan Xiaoting. A fast search and tracking algorithm for the object centroid of a binary image. Department of Computer and Information Science, Harbin Engineering University, 150001.
[3] Yang Xiaohua, Jinping, Yao Weixing. Least-squares fitting of S-N curves considering the influence of confidence-interval length.