Fuzzy c-means clustering based on the kernel method

Summary:

The standard FCM algorithm is largely limited to data that form spherical (blob-shaped) clusters; this post introduces the kernel method to overcome that limitation and optimize the algorithm.

As with many clustering algorithms, FCM uses the Euclidean distance as the dissimilarity measure between sample points and the corresponding cluster centers, which means it tends to discover clusters of similar scale and density. FCM is therefore largely limited to spherical clusters and is not universally applicable. The kernel function used in support vector machines can map the data into a high-dimensional feature space for feature extraction and clustering. At present, kernel methods are widely used in fuzzy clustering algorithms, and their application has become one of the hot topics in computational intelligence, which makes the study of kernel learning very important. This post mainly describes how the kernel method is applied to the FCM algorithm.

I. Overview of kernel methods

1. A brief introduction to kernel methods

The kernel method originates from support vector machine (SVM) theory. It embeds the data into a feature space (a Hilbert space) through a feature map, looks for linear patterns of the data in that feature space, and then uses a kernel function to compute inner products directly from the inputs; the role of the feature map φ is to re-encode the data so that linear patterns between the data points can be discovered. In the dual formulation, all the information the algorithm needs is contained in inner products of the data points in the feature space, so the kernel can be evaluated as a direct function of the input features and the time complexity of the algorithm is reduced. Kernel methods are therefore well suited to nonlinear problems and to problems where computing explicit high-dimensional features would be inefficient.

The kernel function is the key to applying the kernel method, and it largely solves the problems of nonlinearity and high dimensionality. Because the kernel function performs the nonlinear mapping of the data implicitly, inner products can be computed without ever specifying the mapping, which makes the algorithm more efficient and more flexible. In essence, a kernel function measures the similarity between data points. Kernel functions can take many different forms and can be constructed in various ways, and many kernels are now available; different kernel functions act as different similarity measures, and in the FCM algorithm the kernel serves as the measure of distance between a sample point and a cluster center.
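To make this concrete, here is a minimal MATLAB sketch that evaluates a Gaussian kernel and the kernel-induced distance between two points without ever constructing the feature map explicitly; the function handle name and the value of sigma are illustrative choices, not taken from the original post.

% Minimal sketch: a Gaussian (RBF) kernel evaluates the inner product
% <phi(x), phi(y)> in feature space without ever forming phi explicitly.
% The handle name and the value of sigma are illustrative assumptions.
gauss_kernel = @(x, y, sigma) exp(-sum((x - y).^2) / sigma^2);

x = [1.0; 2.0];          % two sample points in input space
y = [1.5; 0.5];
sigma = 2.0;             % kernel width (assumed value)

k_xy = gauss_kernel(x, y, sigma);   % similarity between x and y
k_xx = gauss_kernel(x, x, sigma);   % always 1 for the Gaussian kernel

% Kernel-induced squared distance in feature space:
% ||phi(x) - phi(y)||^2 = K(x,x) - 2K(x,y) + K(y,y) = 2*(1 - K(x,y))
d2 = 2 * (1 - k_xy);
fprintf('K(x,y) = %.4f, feature-space distance^2 = %.4f\n', k_xy, d2);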

Based on the above analysis, what needs to be solved is the cluster analysis of non-spherical clusters. Cover's theorem (T. M. Cover) on the separability of patterns states that a complex pattern-analysis problem, when cast nonlinearly into a high-dimensional space, is more likely to be linearly separable than in a low-dimensional space. Borrowing from support vector machines, the kernel method can use a Mercer kernel to map the data into a high-dimensional feature space for feature extraction, and a suitable kernel function is then chosen for the cluster analysis.
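As a small illustration of Cover's idea (not part of the original post), the sketch below maps points on two concentric circles, which no straight line can separate in the plane, into three dimensions with an explicit lift; afterwards a single coordinate separates the two classes. A kernel performs such a lift implicitly.

% Points on two concentric circles are not linearly separable in 2-D,
% but after the explicit map (x1, x2) -> (x1, x2, x1^2 + x2^2) the two
% classes are separated by a plane on the third coordinate.
theta = linspace(0, 2*pi, 50)';
ring1 = [cos(theta), sin(theta)];        % radius 1 (class 1)
ring2 = 3 * [cos(theta), sin(theta)];    % radius 3 (class 2)

lift = @(X) [X, sum(X.^2, 2)];           % explicit feature map
Z1 = lift(ring1);
Z2 = lift(ring2);

% In the lifted space the third coordinate alone separates the classes:
fprintf('class 1: z3 = %.1f, class 2: z3 = %.1f\n', max(Z1(:,3)), min(Z2(:,3)));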

2. Progress in the study of kernel methods

As early as the 1960s, some scholars introduced kernel functions into pattern recognition, but kernel methods did not develop rapidly until support vector machines were successfully applied to pattern recognition. Schölkopf and colleagues first applied the idea of kernel learning to feature extraction, producing kernel principal component analysis (KPCA); later, Mika, Baudat, Roth and others used kernel learning to extend linear discriminant analysis into kernel discriminant analysis. Research on kernel methods has become more and more active, many new algorithms have appeared, and through the continued efforts of researchers the kernel method has formed a fairly complete body of theory. At present, research on kernel methods can roughly be divided into: classification algorithms based on kernel learning, clustering algorithms based on kernel learning (kernel clustering), and neural networks based on kernel learning.

Kernel clustering is an important branch that has developed in recent years and has achieved fruitful results. As kernel learning has deepened, it has been widely applied in character recognition, face recognition, image retrieval, target tracking, image processing and text categorization, and it is safe to say that kernel methods will find even wider application in the future.

3. Kernel fuzzy c-means clustering

The algorithm replaces the Euclidean distance in the FCM algorithm with a distance induced by a suitable kernel function. Combining this with the FCM algorithm gives kernel fuzzy c-means clustering, which we denote KFCM.

We know that the objective function of FCM is:

    J_m(U, V) = \sum_{i=1}^{c} \sum_{j=1}^{n} u_{ij}^{m} \left\| x_j - v_i \right\|^2

The constraints are:

    \sum_{i=1}^{c} u_{ij} = 1, \qquad u_{ij} \in [0, 1], \qquad j = 1, \dots, n

Introducing the kernel method, the objective function becomes:

    J_m(U, V) = \sum_{i=1}^{c} \sum_{j=1}^{n} u_{ij}^{m} \left\| \varphi(x_j) - \varphi(v_i) \right\|^2

where \varphi denotes the feature map. Using the kernel trick, the squared distance in feature space expands as:

    \left\| \varphi(x_j) - \varphi(v_i) \right\|^2 = K(x_j, x_j) - 2 K(x_j, v_i) + K(v_i, v_i)

Here we choose the Gaussian kernel:

    K(x, y) = \exp\!\left( -\frac{\left\| x - y \right\|^2}{\sigma^2} \right)

Since K(x, x) = 1 for the Gaussian kernel, the objective function can be rewritten as:

    J_m(U, V) = 2 \sum_{i=1}^{c} \sum_{j=1}^{n} u_{ij}^{m} \bigl( 1 - K(x_j, v_i) \bigr)

Minimizing this objective function subject to the constraints yields the update formulas for the cluster centers and the membership matrix:

    v_i = \frac{\sum_{j=1}^{n} u_{ij}^{m} K(x_j, v_i)\, x_j}{\sum_{j=1}^{n} u_{ij}^{m} K(x_j, v_i)}

    u_{ij} = \frac{\bigl( 1 - K(x_j, v_i) \bigr)^{-1/(m-1)}}{\sum_{k=1}^{c} \bigl( 1 - K(x_j, v_k) \bigr)^{-1/(m-1)}}

Using the formulas above, we can find the membership matrix and cluster centers that satisfy the constraints and minimize the objective function while guaranteeing convergence of the algorithm. The concrete algorithm steps are as follows (a minimal MATLAB sketch of these steps is given after the list):

Before the algorithm starts, a dataset X consisting of n l-dimensional vectors and the number of clusters c must be given, and the membership matrix must be initialized.

(1) Set the number of clusters c and the fuzziness exponent m;

(2) Initialize the membership matrix so that each column satisfies the normalization constraint;

(3) Update the cluster centers according to the center formula above;

(4) Update the membership matrix according to the membership formula above;

(5) Compare the membership matrices of successive iterations using a matrix norm; if the change is below the threshold, stop; otherwise return to (3).
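The following is a minimal MATLAB sketch of steps (1)-(5) on a small two-dimensional data set, using the Gaussian kernel and the update formulas given above; the synthetic data and the parameter values (c, m, sigma, the iteration limit and the threshold) are assumptions made only for illustration.

% Minimal KFCM sketch implementing steps (1)-(5) on 2-D points.
% Data and parameter values (c, m, sigma, max_iter, threshold) are assumed.
rng(1);
X = [randn(30,2); randn(30,2) + 4];            % n x l data: two blobs
[n, l] = size(X);
c = 2; m = 2; sigma = 2; max_iter = 100; threshold = 1e-5;

% (2) initialize the membership matrix U (c x n), each column summing to 1
U = rand(c, n);
U = U ./ sum(U, 1);

K = @(x, v) exp(-sum((x - v).^2) / sigma^2);   % Gaussian kernel
V = X(randperm(n, c), :);                      % initial cluster centers

for it = 1:max_iter
    U_old = U;
    % (3) update the cluster centers
    for i = 1:c
        num = zeros(1, l); den = 0;
        for j = 1:n
            w = U(i,j)^m * K(X(j,:), V(i,:));
            num = num + w * X(j,:);
            den = den + w;
        end
        V(i,:) = num / den;
    end
    % (4) update the membership matrix
    for j = 1:n
        d = zeros(c, 1);
        for i = 1:c
            d(i) = (1 - K(X(j,:), V(i,:)))^(-1/(m-1));
        end
        U(:,j) = d / sum(d);
    end
    % (5) check convergence via a matrix norm
    if norm(U - U_old, 'fro') < threshold
        break;
    end
end
[~, labels] = max(U, [], 1);   % hard assignment for inspection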

II. MATLAB implementation

Here we segment the same MR image as before. Create a file MKFC.m; the code is as follows:

%%%%% mkfcm %%%%%
clear all; clc;
a = imread('mri.jpg');
I = imnoise(a, 'salt & pepper', 0.03);   % add salt-and-pepper noise
[height, width, c] = size(a);
if c ~= 1
    a = rgb2gray(a);
end
a = double(a);
[row, column] = size(a);

%% parameters
beta = 100;                  % kernel width (value garbled in the source; 100 is assumed)
cluster_num = 4;             % divide the image into four classes
default_options = [2.0;      % exponent m for the partition matrix U
                   100;      % max. number of iterations (value garbled in the source; 100 is assumed)
                   0.01;     % min. amount of improvement
                   1];       % display info during iteration or not
options   = default_options;
m         = options(1);      % fuzziness exponent
iter_num  = options(2);      % maximum iteration count
threshold = options(3);      % minimum improvement per iteration
display   = options(4);      % display info or not
costfunction = zeros(iter_num, 1);

tic
%% initialize the membership matrix (normalized over the clusters)
membership = zeros(height, width, cluster_num);
for i = 1:height
    for j = 1:width
        member_sum = 0;
        for k = 1:cluster_num
            membership(i,j,k) = rand();
            member_sum = member_sum + membership(i,j,k);
        end
        for p = 1:cluster_num
            membership(i,j,p) = membership(i,j,p) / member_sum;
        end
    end
end

center = zeros(cluster_num, 1);
for o = 1:iter_num                       % iteration control
    %% update the cluster centers
    for u = 1:cluster_num
        s = 0; s1 = 0;
        for h = 1:height
            for t = 1:width
                w  = membership(h,t,u)^m * exp(-((a(h,t) - center(u))^2) / beta^2);
                s  = s  + w * a(h,t);
                s1 = s1 + w;
            end
        end
        center(u) = s / s1;
    end
    %% update the objective function and the membership matrix
    for h = 1:height
        for t = 1:width
            for k = 1:cluster_num
                costfunction(o) = costfunction(o) + 2 * membership(h,t,k)^m * (1 - exp(-((a(h,t) - center(k))^2) / beta^2));
                temp = (1 - exp(-((a(h,t) - center(k))^2) / beta^2))^(-1/(m-1));
                top = 0;
                for p = 1:cluster_num
                    top = top + (1 - exp(-((a(h,t) - center(p))^2) / beta^2))^(-1/(m-1));
                end
                membership(h,t,k) = temp / top;
            end
        end
    end
    if display
        fprintf('Iter. count = %d, obj. fcn = %f\n', o, costfunction(o));
    end
    if o > 1
        if abs(costfunction(o) - costfunction(o-1)) < threshold
            break;
        end
    end
end
toc

%% output image: assign each pixel to the cluster with the largest membership
newimg = zeros(row, column);
for i = 1:row
    for j = 1:column
        maxmembership = membership(i,j,1);
        index = 1;
        for k = 2:cluster_num
            if membership(i,j,k) > maxmembership
                maxmembership = membership(i,j,k);
                index = k;
            end
        end
        newimg(i,j) = round(255 * (1 - (index-1)/(cluster_num-1)));
    end
end
subplot(2,2,1); imshow(uint8(a));      title('original image');
subplot(2,2,2); imshow(I);             title('image with added noise');
subplot(2,2,3); imshow(uint8(newimg)); title('KFCM segmented image');

Running the script displays the original image, the image with added noise, and the KFCM segmentation result.

Compared with the segmentation result of plain FCM shown earlier, this is a considerable improvement. Exploiting the spatial correlation in the image, we can further incorporate the neighborhood information of each pixel into the original code analysed above:

%% neighborhood information
xk = zeros(height, width);
for i = 1:height
    for j = 1:width
        up = i - 1;  down  = i + 1;
        left = j - 1; right = j + 1;
        if up < 1,        up = i;        end
        if down > height, down = height; end
        if left < 1,      left = j;      end
        if right > width, right = width; end
        window = a(up:down, left:right);   % local window around pixel (i,j)
        xk(i,j) = mean(window(:));         % neighborhood mean
    end
end
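The post does not show how the neighborhood mean xk is fed back into the clustering loop; one plausible way, given here only as an assumption, is to blend each pixel with its local mean before the kernel distance is evaluated.

% One possible (assumed) way to use the neighborhood mean xk inside the
% KFCM loop: blend each pixel with its local mean before computing the
% kernel distance. alpha is an illustrative weighting parameter.
alpha = 0.5;
a_smooth = (1 - alpha) * a + alpha * xk;
% ... then use a_smooth(h,t) in place of a(h,t) in the center and
% membership updates of MKFC.m.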

This allows for further optimization of the algorithm.

That is all for today on applying the kernel method to the FCM algorithm. In short, the kernel method really does improve the performance of the algorithm considerably.
