Dice Point Recognition: Image Segmentation


Preface

Last time I used a neural network and a convolutional neural network to recognize dice points, but one serious problem kept bothering me: when two dice are stacked together, separating them is not a simple matter.

Below are some of the unrecognized, stacked-dice images that accumulated during the recognition process.



These pictures come in many shapes: sometimes two dice touch at a corner, sometimes along an edge, and sometimes three dice are stacked together. It is therefore hard to find one satisfactory solution to the problem.

My first idea was to start from the original RGB image and separate the dice objects by tuning the binarization threshold. Unfortunately, none of the variants I tried worked: in the original image the junction between dice is very blurry and the color change there is tiny, so no binary threshold adjustment alone yields a clean separation.
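The failure mode is easy to reproduce outside MATLAB. Below is a small NumPy sketch (not the article's code; Python is used purely for illustration): two synthetic "dice" that touch at a single corner pixel survive any global threshold as one connected component, so thresholding alone cannot split them.

```python
import numpy as np

# Two synthetic "dice" as bright squares on a dark background; they touch
# at one corner, like the stacked-dice images described above.
img = np.zeros((20, 20))
img[2:9, 2:9] = 1.0    # die 1 (7x7)
img[8:15, 8:15] = 1.0  # die 2 (7x7), sharing the corner pixel (8, 8)

mask = img > 0.5  # binarize with a global threshold

# 4-connected component labeling with a simple flood fill.
def label_components(mask):
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue
        current += 1
        stack = [start]
        while stack:
            i, j = stack.pop()
            if labels[i, j] or not mask[i, j]:
                continue
            labels[i, j] = current
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < mask.shape[0] and 0 <= nj < mask.shape[1]:
                    stack.append((ni, nj))
    return labels, current

labels, n = label_components(mask)
print(n)  # one component, even though there are two dice
```

However the threshold is tuned, the touching blobs come out as a single object, which is exactly the problem the methods below try to solve.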

Along the way I tried several different approaches.

1. Watershed


close all; clc
figure(1)
subplot(231)
rgb_img = imread('161220s010129.jpg');
imgsize = size(rgb_img);
rgb_img = imcrop(rgb_img, [imgsize(2)*0.418 imgsize(1)*0.655 215 134]); % most image layouts are fixed
imshow(rgb_img)
subplot(232)
% imhist(a(:,:,1));
bw = im2bw(rgb2gray(rgb_img));
bw = medfilt2(bw);
planes = bwareaopen(bw, 100);
imshow(planes)
%%
subplot(233)
d = bwdist(imcomplement(planes));
d = mat2gray(d);
imshow(d)
figure
subimage(d)
hold on
[c, h] = imcontour(d, 0.2:0.2:0.8);
set(h, 'ShowText', 'on', 'TextStep', get(h, 'LevelStep')*2)
text_handle = clabel(c, h, 'Color', 'g');
figure(1)
%%
subplot(234)
m = imimposemin(imcomplement(d), d > .8);
imshow(m);
%%
subplot(236)
l = watershed(m);
r = l & planes;
imshow(r)
%%
stats = regionprops(r, 'BoundingBox', 'Centroid');
hold on
c = cat(1, stats.Centroid);
plot(c(:,1), c(:,2), 'r*')
bb = {stats.BoundingBox};
cellfun(@(x) rectangle('Position', x, 'EdgeColor', 'y'), bb)
%%
subplot(235)
l(r) = 5;
imshow(l, [])
2. Local thresholding and extraction of connected regions



close all;
rgb_img = imread('161221s010029.jpg');
imgsize = size(rgb_img);
rgb_img = imcrop(rgb_img, [imgsize(2)*0.418 imgsize(1)*0.655 215 134]); % most image layouts are fixed
gry_img = rgb2gray(rgb_img);
level = graythresh(gry_img);
bw_img = im2bw(gry_img, 0.7);
bw_img = imclearborder(bw_img, 8);
[img1, map] = rgb2ind(rgb_img, 64);            % create a quantized image
rplane = reshape(map(img1+1, 1), size(img1));  % red color plane of the image
gplane = reshape(map(img1+1, 2), size(img1));  % green color plane
bplane = reshape(map(img1+1, 3), size(img1));  % blue color plane
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(rplane, []); title('R');
subplot(2,2,2); imshow(gplane, []); title('G');
subplot(2,2,3); imshow(bplane, []); title('B');
subplot(2,2,4); imshow(gry_img, []); title('O');
figure('units','normalized','position',[0 0 1 1]);
i2 = bwareaopen(bw_img, 100, 8); % delete objects with area below 100 px from the binary image (8-connectivity by default)
cc = bwconncomp(i2, 8);          % find the connected regions of the binary image; cc holds the result
num = cc.NumObjects;             % how many objects there are
k = regionprops(cc, 'Area', 'Perimeter', 'MajorAxisLength', 'MinorAxisLength', 'Image'); % measure properties of each image region
subplot(2,2,1); imshow(bw_img, []); title('O');
subplot(2,2,2); imshow(i2, []); title('G');
[m, n] = find(i2 == 1);
max_x = max(m); max_y = max(n);
min_x = min(m); min_y = min(n);
new_img = rgb_img(min_x:max_x, min_y:max_y, :); % tight crop around the foreground
subplot(2,2,3); imshow(new_img, []); title('G');
new_bw_img = im2bw(new_img, 0.7);
subplot(2,2,4); imshow(new_bw_img, []); title('new_bw_img');
% figure('units','normalized','position',[0 0 1 1]);
% for i = 1:num
%     subplot(2,2,i); imshow(k(i).Image, []); title('T');
% end
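The tight-crop idiom near the end of the script ([m,n] = find(i2==1) followed by min/max indexing) translates directly to NumPy. A minimal sketch, again only an illustration of the idiom:

```python
import numpy as np

# A binary mask with one rectangular foreground blob.
mask = np.zeros((12, 12), dtype=bool)
mask[3:8, 4:10] = True

# MATLAB's  [m, n] = find(i2 == 1)  plus min/max crop becomes:
rows, cols = np.nonzero(mask)
top, bottom = rows.min(), rows.max()
left, right = cols.min(), cols.max()
cropped = mask[top:bottom + 1, left:right + 1]
print(cropped.shape)  # → (5, 6)
```

The crop keeps exactly the bounding box of the foreground, which is what the script uses to re-threshold a smaller image.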


3. K-means classification


close all; clear all; clc
c_segments = 3;
img_original = imread('161224s011389.jpg');
imgsize = size(img_original);
img_original = imcrop(img_original, [imgsize(2)*0.418 imgsize(1)*0.655 215 134]); % most image layouts are fixed
figure, imshow(img_original), title('original image');
img_gray = rgb2gray(img_original);
figure, imshow(img_gray), title('original grayscale image');
[m, n] = size(img_gray);   % height and width of the image
t = graythresh(img_gray);  % grayscale threshold
img_bw = im2bw(img_gray, t);
figure, imshow(img_bw), title('original binary image');
% decompose the RGB image into its 3 channels
a = reshape(img_original(:,:,1), m*n, 1); % convert each channel into the n-row format kmeans expects
b = reshape(img_original(:,:,2), m*n, 1);
c = reshape(img_original(:,:,3), m*n, 1);
dat = [a b c]; % the R, G, B components form the samples: each sample has three attributes, width*height samples in total
crgb = kmeans(double(dat), c_segments, ...
    'Distance', 'cityblock', ...
    'EmptyAction', 'singleton', ...
    'Start', 'sample');     % cluster the pixels
rrgb = reshape(crgb, m, n); % convert back to picture form
figure, imshow(label2rgb(rrgb)), title('RGB channel segmentation result');
% cluster the single-channel grayscale image
grayseg = reshape(img_gray(:,:), m*n, 1);
cgray = kmeans(double(grayseg), 2);
rgray = reshape(cgray, m, n); % convert back to picture form
figure, imshow(label2rgb(rgray)), title('grayscale channel segmentation result');
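For readers without the Statistics Toolbox, the kmeans call above is just Lloyd's algorithm run on rows of [R G B]. A self-contained NumPy sketch on synthetic pixels (the deterministic initialization is my simplification; MATLAB's 'Start','sample' picks random rows):

```python
import numpy as np

# Synthetic "pixels": a dark background cluster and a bright dice cluster.
rng = np.random.default_rng(0)
dark = rng.normal(30, 5, size=(100, 3))     # background-ish RGB triples
bright = rng.normal(220, 5, size=(100, 3))  # dice-face-ish RGB triples
data = np.vstack([dark, bright])

def kmeans(data, k, iters=10):
    # deterministic init for the demo: evenly spaced sample rows
    centers = data[np.linspace(0, len(data) - 1, k, dtype=int)].copy()
    labels = np.zeros(len(data), dtype=int)
    for _ in range(iters):
        # assign every sample to its nearest center (Euclidean)
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its members
        for c in range(k):
            if np.any(labels == c):
                centers[c] = data[labels == c].mean(axis=0)
    return labels, centers

labels, centers = kmeans(data, 2)
```

On well-separated colors the two clusters recover background vs. dice pixels, which is all the script above asks of kmeans.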


4. Sobel operator plus watershed (this is the official MATLAB example)



clc; clear all; close all;
% Step 1: read the color image and convert it to grayscale.
rgb = imread('abc.jpg');
if ndims(rgb) == 3
    I = rgb2gray(rgb);
else
    I = rgb;
end
figure('units','normalized','position',[0 0 1 1]);
subplot(1,2,1); imshow(rgb); title('original');
subplot(1,2,2); imshow(I); title('grayscale');
%% Step 2: use the gradient magnitude as the segmentation function.
% Filter the image horizontally and vertically with the Sobel edge operator
% and take the modulus: the filtered image shows large values at boundaries
% and very small values away from them.
hy = fspecial('sobel');  % fspecial builds a predefined filter kernel
hx = hy';
Iy = imfilter(double(I), hy, 'replicate');  % linear spatial filtering of an array or multidimensional image
Ix = imfilter(double(I), hx, 'replicate');
gradmag = sqrt(Ix.^2 + Iy.^2);
figure('units','normalized','position',[0 0 1 1]);
subplot(1,2,1); imshow(I, []); title('grayscale image');
subplot(1,2,2); imshow(gradmag, []); title('gradient magnitude image');
%% Step 3: mark the foreground objects.
% There are several ways to obtain foreground markers, which must be
% connected blobs of pixels inside each foreground object. Here,
% morphological "opening-by-reconstruction" and "closing-by-reconstruction"
% are used to clean up the image. These operations create flat maxima
% inside each object that imregionalmax can locate.
% Opening is erosion followed by dilation; closing is dilation followed by
% erosion. Both remove image details smaller than the structuring element
% without introducing global geometric distortion: opening filters out
% spikes smaller than the structuring element and cuts slender bridges,
% separating objects; closing fills notches and holes smaller than the
% structuring element and bridges short gaps.
% Opening-by-reconstruction is erosion followed by a morphological
% reconstruction. The two approaches are compared below.
% First, open with imopen.
se = strel('disk', 2);  % structuring element for erosion, dilation, opening and closing; usage: se = strel(shape, parameters)
Io = imopen(I, se);     % opening smooths object contours, breaks narrow necks and removes thin protrusions
figure('units','normalized','position',[0 0 1 1]);
subplot(1,2,1); imshow(I, []); title('grayscale image');
subplot(1,2,2); imshow(Io); title('opened image');
% Next, compute the opening-by-reconstruction: erode, then reconstruct.
Ie = imerode(I, se);
Iobr = imreconstruct(Ie, I);
figure('units','normalized','position',[0 0 1 1]);
subplot(1,2,1); imshow(I, []); title('grayscale image');
subplot(1,2,2); imshow(Iobr, []); title('opening-by-reconstruction');
% Following the opening with a closing removes the darker spots and stem
% marks. Compare a regular morphological closing with a
% closing-by-reconstruction. First, use imclose:
Ioc = imclose(Io, se);
Ic = imclose(I, se);
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(I, []); title('grayscale image');
subplot(2,2,2); imshow(Io, []); title('opened image');
subplot(2,2,3); imshow(Ic, []); title('closed image');
subplot(2,2,4); imshow(Ioc, []); title('open-close image');
% Now use imdilate followed by imreconstruct. Note that the input images
% must be complemented, and the imreconstruct output complemented back.
% (im2 = imcomplement(im) computes the complement of an image; im can be
% binary, grayscale or RGB, and im2 has the same class and size as im.)
Iobrd = imdilate(Iobr, se);  % dilate Iobr with the structuring element se
Iobrcbr = imreconstruct(imcomplement(Iobrd), imcomplement(Iobr));
Iobrcbr = imcomplement(Iobrcbr);
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(I, []); title('grayscale image');
subplot(2,2,2); imshow(Ioc, []); title('open-close image');
subplot(2,2,3); imshow(Iobr, []); title('opening-by-reconstruction');
subplot(2,2,4); imshow(Iobrcbr, []); title('closing-by-reconstruction');
% Comparing Iobrcbr with Ioc shows that reconstruction-based opening and
% closing is more effective than standard opening and closing at removing
% small blemishes without affecting the overall shape of the objects.
% Compute the regional maxima of Iobrcbr to obtain good foreground markers.
fgm = imregionalmax(Iobrcbr);
figure('units','normalized','position',[0 0 1 1]);
subplot(1,3,1); imshow(I, []); title('grayscale image');
subplot(1,3,2); imshow(Iobrcbr, []); title('reconstruction-based open-close');
subplot(1,3,3); imshow(fgm, []); title('regional maxima image');
% To help interpret the result, superimpose the foreground markers on the original image.
It1 = rgb(:,:,1); It2 = rgb(:,:,2); It3 = rgb(:,:,3);
It1(fgm) = 255; It2(fgm) = 0; It3(fgm) = 0;
I2 = cat(3, It1, It2, It3);
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(rgb, []); title('original image');
subplot(2,2,2); imshow(Iobrcbr, []); title('reconstruction-based open-close');
subplot(2,2,3); imshow(fgm, []); title('regional maxima image');
subplot(2,2,4); imshow(I2); title('regional maxima superimposed on the original');
% Notice that most occluded and shadowed objects are not marked, which
% means they will not be segmented properly in the result. Also, the
% foreground markers of some objects reach the object edges, so the edges
% of the marker blobs should be cleaned up and then shrunk: a closing
% followed by an erosion does this.
se2 = strel(ones(5,5));
fgm2 = imclose(fgm, se2);
fgm3 = imerode(fgm2, se2);
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(Iobrcbr, []); title('reconstruction-based open-close');
subplot(2,2,2); imshow(fgm, []); title('regional maxima image');
subplot(2,2,3); imshow(fgm2, []); title('after closing');
subplot(2,2,4); imshow(fgm3, []); title('after erosion');
% This procedure leaves some stray isolated pixels that should be removed.
% bwareaopen does it: bw2 = bwareaopen(bw, p) removes from the binary image
% all connected components with fewer than p pixels, returning a new binary
% image bw2.
fgm4 = bwareaopen(fgm3, 20);  % the official example uses p = 20
It1 = rgb(:,:,1); It2 = rgb(:,:,2); It3 = rgb(:,:,3);
It1(fgm4) = 255; It2(fgm4) = 0; It3(fgm4) = 0;
I3 = cat(3, It1, It2, It3);
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(I2, []); title('regional maxima superimposed on the original');
subplot(2,2,2); imshow(fgm3, []); title('after closing and erosion');
subplot(2,2,3); imshow(fgm4, []); title('small specks removed');
subplot(2,2,4); imshow(I3, []); title('cleaned maxima superimposed on the original');
%% Step 4: compute the background markers. In the cleaned-up image Iobrcbr
% the dark pixels belong to the background, so a thresholding operation is
% a good start.
bw = im2bw(Iobrcbr, graythresh(Iobrcbr));
figure('units','normalized','position',[0 0 1 1]);
subplot(1,2,1); imshow(Iobrcbr, []); title('reconstruction-based open-close');
subplot(1,2,2); imshow(bw, []); title('thresholded image');
% The background pixels are in the black region, but ideally the background
% markers should not be too close to the edges of the objects we are trying
% to segment. "Thin" the background by computing the skeleton by influence
% zones (SKIZ) of the foreground of bw: compute the watershed transform of
% the distance transform of bw and look for the watershed ridge lines
% (DL == 0). (d = bwdist(bw) computes the Euclidean distance transform of
% the binary image bw: for each pixel, the distance to the nearest nonzero
% pixel of bw. bwdist uses the Euclidean metric by default; bw can have any
% number of dimensions and d has the same size as bw.)
D = bwdist(bw);
DL = watershed(D);
bgm = DL == 0;
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(Iobrcbr, []); title('reconstruction-based open-close');
subplot(2,2,2); imshow(bw, []); title('thresholded image');
subplot(2,2,3); imshow(label2rgb(DL)); title('watershed transform');
subplot(2,2,4); imshow(bgm, []); title('watershed ridge lines');
%% Step 5: compute the watershed transform of the segmentation function.
gradmag2 = imimposemin(gradmag, bgm | fgm4);
L = watershed(gradmag2);
figure('units','normalized','position',[0 0 1 1]);
subplot(2,2,1); imshow(bgm, []); title('watershed ridge lines');
subplot(2,2,2); imshow(fgm4, []); title('foreground markers');
subplot(2,2,3); imshow(gradmag, []); title('gradient magnitude image');
subplot(2,2,4); imshow(gradmag2, []); title('modified gradient magnitude image');
% Finally, the image can be segmented with this marker-controlled watershed.
%% Step 6: view the result.
It1 = rgb(:,:,1); It2 = rgb(:,:,2); It3 = rgb(:,:,3);
fgm5 = imdilate(L == 0, ones(3,3)) | bgm | fgm4;
It1(fgm5) = 255; It2(fgm5) = 0; It3(fgm5) = 0;
I4 = cat(3, It1, It2, It3);
figure('units','normalized','position',[0 0 1 1]);
subplot(1,2,1); imshow(rgb, []); title('original image');
subplot(1,2,2); imshow(I4, []); title('markers and object edges superimposed on the original');
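Step 2 of the example (Sobel filtering plus modulus) is compact enough to sketch in NumPy. conv2_replicate below is a hypothetical helper of my own that mimics imfilter's 'replicate' border handling; on a simple step edge the gradient magnitude is large only at the boundary and exactly zero in flat regions, which is why it makes a good watershed segmentation function.

```python
import numpy as np

def conv2_replicate(img, k):
    # 3x3 correlation with replicate (edge-clamped) borders,
    # mimicking imfilter(..., 'replicate')
    p = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for di in range(3):
        for dj in range(3):
            out += k[di, dj] * p[di:di + img.shape[0], dj:dj + img.shape[1]]
    return out

sobel_y = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)  # fspecial('sobel')
sobel_x = sobel_y.T

img = np.zeros((8, 8))
img[:, 4:] = 10.0  # a vertical step edge between columns 3 and 4

gy = conv2_replicate(img, sobel_y)
gx = conv2_replicate(img, sobel_x)
gradmag = np.sqrt(gx**2 + gy**2)
```

The magnitude is 40 on the two columns straddling the edge and 0 everywhere else; with replicate padding the horizontal edge response gy stays zero because every row is identical.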


The four methods above, though my attempts at each only scratched the surface, could not produce a good result on my problem, even using the most readily available techniques. I kept hoping for a ready-made method I could use directly, but for a long time I could not find one. I am also a novice and know very little about image processing; my background is in biology, which tends to give me ideas, and I came up with another method of my own, which I call the rotary cutting method.

Rotary Cutting Method



This method works well for pictures in which only two dice are joined together. The principle is actually very simple; let me walk you through how I arrived at it.

At first, I made a detailed survey of the dice images that could not be split, and found that a picture of two joined dice basically has a central point, located roughly at the center of the picture. If I take this center point as an endpoint, draw a line from it, and rotate the line through 360 degrees, scanning every angle, then the angle whose line contains the largest percentage of black pixels is the one I should cut along. Then I look for a second such line, at an angle more than 90 degrees away from the first.
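The scanning idea can be sketched as follows (a NumPy illustration, not the author's implementation): cast a ray from the center at each candidate angle, sample it with naive integer rounding, and keep the angle with the highest black-pixel fraction. The rounding is deliberately naive here, since that is exactly the weakness discussed below.

```python
import numpy as np

img = np.ones((41, 41), dtype=int)  # white canvas
img[20, :] = 0                      # a black seam along the horizontal axis

center = np.array([20.0, 20.0])
angles = np.arange(0, 2 * np.pi, np.pi / 4)  # scan every 45 degrees

def ray_black_fraction(img, center, theta, length=20):
    # sample `length` points along the ray and score the black fraction
    hits = []
    for r in range(1, length + 1):
        i = int(round(center[0] + r * np.sin(theta)))  # row (naive rounding)
        j = int(round(center[1] + r * np.cos(theta)))  # column
        if 0 <= i < img.shape[0] and 0 <= j < img.shape[1]:
            hits.append(img[i, j])
    return hits.count(0) / len(hits)

scores = [ray_black_fraction(img, center, t) for t in angles]
best = angles[int(np.argmax(scores))]  # angle of the cut line
```

On this toy image the ray at angle 0 (and its opposite at pi) lies entirely on the black seam, so the scan correctly picks the horizontal direction as the cut.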


The idea was beautiful, but once implemented it did not work out. Because the image is a matrix, coordinates can only be integers, and at some angles the sampled line simply misses pixels; this happens a lot.

Please look at the code:

clear all; close all;
img = imread('stackdice/161221s011172.jpg-1.jpg');
img = im2bw(img);
figure; imshow(img, []);
s_x = 1; s_y = 1;
[m, n] = size(img);
d = ceil(sqrt(m^2 + n^2)); % diameter
new_matrix = zeros(d, d);
t2 = [1,0,0; 0,1,0; -m/2,-n/2,1];         % translate x,y to the origin
t3 = [1,0,0; 0,1,0; m/2,n/2,1];           % translate back
t4 = [1,0,0; 0,1,0; -(d/2+m),-(d/2+n),1]; % inverse translation
t5 = [1,0,0; 0,1,0; (d-m)/2,(d-n)/2,1];   % inverse translation
t1 = [s_x,0,0; 0,s_y,0; 0,0,1];           % scaling coefficient matrix
% t = t2*t1*t3;  % p_new = p_old*t2*t1*t3; the order cannot be changed
t = t4*t5;
for i = 1:d
    for j = 1:d
        p = floor([i,j,1]*t5^-1);  % from p_new = p_old*t we get p_old = p_new*(t^-1)
        if (p(1) <= m) && (p(1) > 0) && (p(2) <= n) && (p(2) > 0)  % limit the range
            new_matrix(i,j) = img(p(1), p(2));  % coordinate transformation
        else
            new_matrix(i,j) = 0;  % points with no source pixel get 0
        end
    end
end
%% the original picture is now padded onto a d-by-d canvas
center_x = d/2; center_y = d/2;  % center coordinates
figure; imshow(new_matrix, []); hold on;
pointgroup = [center_x center_y; d d/2];  % initial position
anglegroup = 0:pi/4:2*pi;
init_line_value = cell(d/2);
for i = 1:length(anglegroup)
    theat = anglegroup(i);
    pointgroup(2,1) = ceil((cos(theat)+1)*d/2);
    pointgroup(2,2) = ceil((sin(theat)+1)*d/2);
    plot(pointgroup(:,1), pointgroup(:,2), '.-', 'Color', [(0.7+0.1/i)^2 (0.05*i) 0.5/i^2]);
    if pointgroup(2,1) == 0
        pointgroup(2,1) = 1;
    end
    if pointgroup(2,2) == 0
        pointgroup(2,2) = 1;
    end
    A = pointgroup(1,:);  % point A
    B = pointgroup(2,:);  % point B
    if B(1)-A(1) == 0     % vertical scan line
        if A(2)-B(2) > 0
            tmpl = new_matrix(A(1), B(2):A(2));
        else
            tmpl = new_matrix(A(1), A(2):B(2));
        end
        init_line_value(i, 1:length(tmpl)) = num2cell(tmpl);
        continue;
    end
    if B(2)-A(2) == 0     % horizontal scan line
        if A(1)-B(1) < 0
            tmpl = new_matrix(A(1):B(1), A(2));
        else
            tmpl = new_matrix(B(1):A(1), A(2));
        end
        init_line_value(i, 1:length(tmpl)) = num2cell(tmpl);
        continue;
    end
    k = (B(2)-A(2))/(B(1)-A(1));  % k is the slope
    b = A(2) - k*A(1);            % b is a constant (line equation: y = k*x + b)
    % four cases, one per quadrant of the rotation
    if pointgroup(2,1) >= d/2 && pointgroup(2,2) >= d/2
        kkaa = 1;
        for kk1 = d/2:pointgroup(2,1)
            init_line_value{i,kkaa} = new_matrix(kk1, ceil(k*kk1+b));
            kkaa = kkaa+1;
        end
    end
    if pointgroup(2,1) < d/2 && pointgroup(2,2) > d/2
        kkaa = 1;
        for kk1 = pointgroup(2,1):d/2
            init_line_value{i,kkaa} = new_matrix(kk1, ceil(k*kk1+b));
            kkaa = kkaa+1;
        end
    end
    if pointgroup(2,1) > d/2 && pointgroup(2,2) < d/2
        kkaa = 1;
        for kk1 = d/2:pointgroup(2,1)
            init_line_value{i,kkaa} = new_matrix(kk1, ceil(k*kk1+b));
            kkaa = kkaa+1;
        end
    end
    if pointgroup(2,1) < d/2 && pointgroup(2,2) < d/2
        kkaa = 1;
        for kk1 = pointgroup(2,1):d/2
            init_line_value{i,kkaa} = new_matrix(kk1, ceil(k*kk1+b));
            kkaa = kkaa+1;
        end
    end
end

Then, since sampling along a selected line did not work well, the only option left is to rotate the entire image. Why did I try the line sampling first? Compared with rotating the whole matrix, rotating a single vector naturally needs far fewer operations. Sure enough, rotating the whole picture and then sampling along a fixed horizontal line runs much slower.


clear all; close all;
img = imread('stackdice/161221s011161.jpg-2.jpg');
img = im2bw(img);
s_x = 1; s_y = 1;
[m, n] = size(img);
d = ceil(sqrt(m^2 + n^2)); % diameter
if mod(d,2) ~= 0
    d = d+1;
end
new_matrix = zeros(d, d);
t2 = [1,0,0; 0,1,0; -m/2,-n/2,1];         % translate x,y to the origin
t3 = [1,0,0; 0,1,0; m/2,n/2,1];           % translate back
t4 = [1,0,0; 0,1,0; -(d/2+m),-(d/2+n),1]; % inverse translation
t5 = [1,0,0; 0,1,0; (d-m)/2,(d-n)/2,1];   % inverse translation
t1 = [s_x,0,0; 0,s_y,0; 0,0,1];           % scaling coefficient matrix
% t = t2*t1*t3;  % p_new = p_old*t2*t1*t3; the order cannot be changed
t = t4*t5;
for i = 1:d
    for j = 1:d
        p = floor([i,j,1]*t5^-1);  % p_old = p_new*(t^-1)
        if (p(1) <= m) && (p(1) > 0) && (p(2) <= n) && (p(2) > 0)  % limit the range
            new_matrix(i,j) = img(p(1), p(2));
        else
            new_matrix(i,j) = 0;
        end
    end
end
%% the picture is now padded onto a d-by-d canvas
center_x = d/2; center_y = d/2;  % center coordinates
figure; imshow(new_matrix, []); hold on;
pointgroup = [center_x center_y; d d/2];  % initial position
anglegroup = 0:pi/4:2*pi;
for i = 1:length(anglegroup)
    theat = anglegroup(i);
    pointgroup(2,1) = ceil((cos(theat)+1)*d/2);
    pointgroup(2,2) = ceil((sin(theat)+1)*d/2);
    plot(pointgroup(:,1), pointgroup(:,2), '.-', 'Color', [(0.7+0.1/i)^2 (0.05*i) 0.5/i^2]);
end
for i222 = 1:length(anglegroup)
    theat = anglegroup(i222);
    newroate = rotatefunction(new_matrix, theat);  % rotatefunction is my own rotation helper
    x1_pix(:,:,i222) = newroate(d/2, d/2:d);  % right half of the middle row
    x2_pix(:,:,i222) = newroate(d/2, 1:d/2);  % left half of the middle row
end
max = 0; max_index = 0;
for i = 1:size(x1_pix, 3)  % iterate over the angles
    tmp = length(find(x1_pix(:,:,i) == 0));
    tmp_rate = tmp/length(x1_pix(:,:,i));
    if tmp_rate > max
        max = tmp_rate;
        max_index = anglegroup(i);
    end
end
for i = 1:size(x2_pix, 3)
    tmp = length(find(x2_pix(:,:,i) == 0));
    tmp_rate = tmp/length(x2_pix(:,:,i));
    if tmp_rate > max
        max = tmp_rate;
        max_index = anglegroup(i);
    end
end
newroate_final = rotatefunction(new_matrix, max_index);
img1 = newroate_final(1:d/2, 1:d);  % top half after the cut
img2 = newroate_final(d/2:d, 1:d);  % bottom half after the cut
figure; imshow(img1, []);
figure; imshow(img2, []);
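rotatefunction is the author's helper and is not listed in the article; the following NumPy version is my guess at its shape: a nearest-neighbour rotation about the canvas center via inverse mapping, so every destination pixel pulls a value from the source and no holes appear along the sampled row.

```python
import numpy as np

def rotate_nn(img, theta):
    # nearest-neighbour rotation of a square canvas about its center,
    # using inverse mapping (destination -> source), so no pixel is skipped
    d = img.shape[0]
    c = (d - 1) / 2.0
    out = np.zeros_like(img)
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    for i in range(d):
        for j in range(d):
            # inverse-rotate the destination coords back into the source
            y, x = i - c, j - c
            si = int(round(c + y * cos_t - x * sin_t))
            sj = int(round(c + y * sin_t + x * cos_t))
            if 0 <= si < d and 0 <= sj < d:
                out[i, j] = img[si, sj]
    return out

img = np.zeros((21, 21), dtype=int)
img[10, 15] = 1  # a marker to the right of the center
rot = rotate_nn(img, np.pi / 2)
```

Rotating by 90 degrees moves the marker from the right of the center to below it, and a zero-angle rotation leaves the image untouched, which is the behaviour the scanning loop above relies on.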


In any case, it works, so I will use it for now. For three dice joined together I do not yet have a good method. But for the moment, this design is enough for us to use in a formal production environment.

Here is what the splitting result looks like:


Summary:

In general, I tried a great many methods. Perhaps my understanding of the other algorithms is not deep enough, which is why they gave no good results; but whichever method fits the problem best is the best method.



