The difference between the covering algorithm and SVM

Source: Internet
Author: User
Tags: svm

SVM is a learning method based on statistical learning theory. Building on VC-dimension theory and the structural risk minimization principle, it seeks the best trade-off between model complexity and learning ability from limited sample information, and separates the two classes by constructing an optimal hyperplane in the sample space or a feature space, so as to obtain the best generalization ability. For a linearly separable two-class problem, the maximum-margin hyperplane constructed from the support vectors separates the samples. For linearly inseparable problems, the sample space is mapped into a high-dimensional feature space by choosing a suitable kernel function, and the optimal separating hyperplane is then constructed in that feature space; this replaces the traditional approach of improving performance by adding neurons with one of reducing the complexity of the samples. SVM has been applied successfully in data mining, handwriting recognition, disease diagnosis, image classification, bioinformatics and other fields.
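As a minimal sketch of the kernel idea described above (illustrative only, not the code from this article; all names here are hypothetical), an RBF kernel measures similarity as if the samples had been mapped into a high-dimensional feature space, and a trained SVM classifies by the sign of a kernel expansion over its support vectors:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// RBF (Gaussian) kernel: k(x, z) = exp(-gamma * ||x - z||^2).
// It implicitly maps samples into a high-dimensional feature space
// where a linear separating hyperplane can be sought.
double rbf_kernel(const std::vector<double> &x,
                  const std::vector<double> &z, double gamma) {
    double sq = 0.0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        double diff = x[i] - z[i];
        sq += diff * diff;
    }
    return std::exp(-gamma * sq);
}

// A trained SVM decides by the sign of
//   f(x) = sum_i (alpha_i * y_i) * k(sv_i, x) + b
// over its support vectors sv_i (alpha_y holds alpha_i * y_i).
double svm_decision(const std::vector<std::vector<double>> &sv,
                    const std::vector<double> &alpha_y,
                    double b, double gamma,
                    const std::vector<double> &x) {
    double f = b;
    for (std::size_t i = 0; i < sv.size(); ++i)
        f += alpha_y[i] * rbf_kernel(sv[i], x, gamma);
    return f;  // predict class +1 if f > 0, else -1
}
```

Note that the kernel is evaluated only between pairs of samples; the feature space itself is never materialized, which is the point of the kernel trick.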

However, SVM mainly studies the laws of machine learning under small samples, and whether the problem is linear or nonlinear, training requires solving a complex quadratic programming problem. In addition, SVM is essentially a binary classifier: for multi-class problems one must combine several binary SVMs, for example by constructing an SVM decision tree or using one-vs-one / one-vs-rest schemes, which increases the computational burden and lowers efficiency.
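The multi-class combination step can be sketched as follows (a hypothetical one-vs-rest example, not from this article): each class gets its own binary SVM, and the class whose classifier is most confident wins.

```cpp
#include <cstddef>
#include <vector>

// One-vs-rest multi-class decision: decision_values[c] is assumed to be
// the decision value f_c(x) of an already-trained binary SVM that
// separates class c from all other classes. We predict the class whose
// classifier is most confident about x.
int one_vs_rest_predict(const std::vector<double> &decision_values) {
    int best = 0;
    for (std::size_t c = 1; c < decision_values.size(); ++c)
        if (decision_values[c] > decision_values[best])
            best = static_cast<int>(c);
    return best;  // index of the winning class
}
```

For k classes this scheme trains k binary SVMs; the one-vs-one alternative trains k(k-1)/2 of them, which is part of the computational cost the paragraph above refers to.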

The covering algorithm, likewise a structural learning method, can handle large numbers of samples. Based on the characteristics of the training samples, it cuts the sphere on which the samples lie with hyperplanes to form "spherical covers", each of which serves as a neuron in the constructed network. It has strong classification ability, runs fast, is well suited to multi-class problems, is highly readable, and its parameters are relatively easy to determine. (The C++ code of the covering algorithm (NCA) part, taken from a book, is given below.)

// Training: build spherical covers over the training samples (NCA).
// Identifiers reconstructed from the garbled listing: I is assumed to
// group the training samples by class, C holds the learned covers,
// samples is the full sample set, and result collects experiment data.
void sample_train() {
	clock_t t1, t2;
	t1 = clock();
	sort_sample();
	int seq = 0;
	int coved_num = 0;
	srand((unsigned)time(NULL));
	for (size_t t = 0; t < I.size(); ++t) {    // one class at a time
		vector<int> v;                         // indices of class-t samples
		for (size_t i = 0; i < I[t]->v.size(); ++i) {
			v.push_back(i);
		}
		int uncovedn = I[t]->v.size();         // samples not yet covered
		while (!v.empty()) {
			random_shuffle(v.begin(), v.end());
			int s = v[0];                      // random center candidate
			if (I[t]->v[s]->covered == 1) {    // already inside some cover
				v.erase(v.begin());
				continue;
			}
			// d1: nearest sample of a different class;
			// d2: farthest same-class sample still closer than d1.
			double d1 = find_diffmin_d(s, t);
			double d2 = find_samemax_d(s, t, d1);
			double d = (d1 + d2) / 2;          // radius between d2 and d1
			coved_num = cover_sample(s, t, d); // mark newly covered samples
			Cover *c = new Cover;
			int k = I[t]->v[s]->id;
			int dim = samples[k]->dim;
			c->seq = seq;
			c->center = new double[dim];
			for (int ii = 0; ii < dim; ++ii) {
				c->center[ii] = samples[k]->x[ii];
			}
			c->cls = samples[k]->y;            // all of I[t] share one class
			c->r = d;
			C.push_back(c);
			uncovedn -= coved_num;
			seq++;
			v.erase(v.begin());
		}
	}
	t2 = clock();
	// record training statistics in the latest experiment result
	vector<ExpResult *>::reverse_iterator it = result.rbegin();
	(*it)->covnum = C.size();
	(*it)->trnum = samples.size();
	(*it)->trtime = (double)(t2 - t1) / CLOCKS_PER_SEC;
}

// Testing: assign each sample to the cover it lies deepest inside
// (largest d - r); if the best value is negative, the sample falls
// outside every cover and is refused, i.e. only "guessed".
void sample_test() {
	int correct = 0;   // covered and correctly classified
	int refuse = 0;    // outside every cover
	int uc = 0;        // refused, but the nearest cover guesses right
	clock_t t1, t2;
	t1 = clock();
	for (size_t i = 0; i < samples.size(); i++) {
		double cnt_nearest = -DBL_MAX;
		int k = -1;
		size_t j = 0;
		while (j < C.size()) {
			double d = inner_product(*samples[i], C[j]->center);
			double r = C[j]->r;
			if (cnt_nearest < (d - r)) {
				cnt_nearest = d - r;
				k = j;
			}
			++j;
		}
		if (cnt_nearest < 0) {
			++refuse;
			if (samples[i]->y == C[k]->cls) {
				++uc;
			}
		} else {
			if (samples[i]->y == C[k]->cls) {
				++correct;
			}
		}
	}
	t2 = clock();
	// save experiment data
	vector<ExpResult *>::reverse_iterator it = result.rbegin();
	(*it)->correct = correct;
	(*it)->refuse = refuse;
	(*it)->guess_corr = uc;
	(*it)->tenum = samples.size();
	(*it)->tetime = (double)(t2 - t1) / CLOCKS_PER_SEC;
	(*it)->corr_rate = ((float)(correct + uc)) / samples.size();
}


 
