Summary of machine learning problem methods (repost)

Source: Internet
Author: User
Tags: SVM

A summary of machine learning problem methods

The first part groups methods by category (big class); each entry gives the method name followed by its keywords.

Supervised classification
  Decision tree: information gain
  Classification and regression tree (CART): Gini index, χ² statistic, pruning
  Naive Bayes: non-parametric estimation, Bayesian estimation
  Linear discriminant analysis: Fisher discriminant, eigenvector solution
  K-nearest neighbors: similarity measures (Euclidean distance, city-block distance, edit distance, vector angle, Pearson correlation coefficient)
  Logistic regression (binary classification): parameter estimation (maximum likelihood), sigmoid function
  Radial basis function (RBF) network: non-parametric estimation, regularization theory, sigmoid function
  Counter-propagation network: unsupervised competitive learning, supervised Widrow-Hoff learning
  Learning vector quantization (LVQ) network: each output-layer unit is connected to several competitive-layer units
  Error back-propagation (BP) network: sigmoid function, gradient descent
  Support vector machine (binary classification): quadratic programming, Lagrange multiplier method, dual problem, optimization, sequential minimal optimization (SMO), kernel trick
  Single-layer perceptron: can only handle linearly separable problems
  Two-hidden-layer perceptron: sufficient to solve arbitrarily complex classification problems
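
To make the K-nearest-neighbor keywords concrete, here is a minimal plain-Python sketch of a 1-nearest-neighbor classifier using a few of the similarity measures listed above (Euclidean, city-block, and vector-angle distance); the toy points and labels are made up for illustration.

    import math

    # Three of the similarity/distance measures listed above.
    def euclidean(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def city_block(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))

    def cosine_distance(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return 1.0 - dot / norm  # smaller angle -> smaller distance

    def nearest_neighbor_label(query, samples, labels, dist=euclidean):
        # Return the label of the training sample closest to the query point.
        distances = [dist(query, s) for s in samples]
        return labels[distances.index(min(distances))]

    # Toy data: two clusters in the plane.
    samples = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
    labels  = ["A", "A", "B", "B"]
    print(nearest_neighbor_label((0.1, 0.2), samples, labels))              # A
    print(nearest_neighbor_label((4.8, 5.0), samples, labels, city_block))  # B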

Unsupervised classification (clustering)
  K-means: centroid
  Chameleon: graph partitioning, relative interconnectivity, relative closeness
  BIRCH: B-tree, CF (clustering feature) triple
  DBSCAN: core points, density-reachability
  EM algorithm (Gaussian mixture model): parameter estimation (maximum likelihood)
  Spectral clustering: graph partitioning, singular value solution, global convergence
  Self-organizing map (SOM) network: unsupervised competitive learning
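
The "centroid" keyword for K-means refers to the alternating assignment/update loop sketched below (plain Python; the toy points, starting centroids, and fixed iteration count are illustrative assumptions).

    import math

    def kmeans(points, centroids, iterations=10):
        for _ in range(iterations):
            # Assignment step: attach each point to its nearest centroid.
            clusters = [[] for _ in centroids]
            for p in points:
                d = [math.dist(p, c) for c in centroids]
                clusters[d.index(min(d))].append(p)
            # Update step: move each centroid to the mean of its cluster.
            centroids = [
                tuple(sum(coord) / len(cluster) for coord in zip(*cluster)) if cluster else c
                for cluster, c in zip(clusters, centroids)
            ]
        return centroids, clusters

    points = [(0, 0), (0, 1), (1, 0), (9, 9), (9, 10), (10, 9)]
    centroids, clusters = kmeans(points, centroids=[(0, 0), (5, 5)])
    print(centroids)   # roughly (0.33, 0.33) and (9.33, 9.33)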

Regression analysis
  General linear regression: parameter estimation, least squares; generally used for prediction rather than classification
  Logistic regression (binary classification): parameter estimation (maximum likelihood), sigmoid function
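
A minimal sketch of general linear regression fitted by least squares via the normal equations (NumPy assumed; the toy data are made up). With zero-mean Gaussian errors this coincides with the maximum-likelihood estimate discussed further down.

    import numpy as np

    # Toy data: y is roughly 2*x + 1 plus noise.
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

    # Design matrix with an intercept column, then the normal equations
    # (X^T X) w = X^T y solved for the coefficient vector w.
    X = np.column_stack([np.ones_like(x), x])
    w = np.linalg.solve(X.T @ X, X.T @ y)
    print("intercept, slope:", w)                        # roughly [1.0, 2.0]
    print("prediction at x=5:", np.array([1.0, 5.0]) @ w)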

Association rule mining
  FP-tree: frequent 1-itemsets, FP-tree, conditional pattern base, suffix pattern

Dimensionality reduction
  Principal component analysis (PCA): covariance matrix, singular value decomposition
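
A minimal sketch of PCA through the covariance matrix and its eigendecomposition (NumPy assumed; for the symmetric covariance matrix this agrees with the singular value decomposition named above; the random data are illustrative).

    import numpy as np

    rng = np.random.default_rng(0)
    # 200 samples in 3 dimensions with most variance along one direction.
    X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.1])

    # Center the data and form the covariance matrix.
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(Xc) - 1)

    # Eigenvectors of the symmetric covariance matrix are the principal axes.
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]            # sort by decreasing variance
    components = eigvecs[:, order[:2]]           # keep the top-2 directions

    X_reduced = Xc @ components                  # project to 2 dimensions
    print("explained variance:", eigvals[order[:2]])
    print("reduced shape:", X_reduced.shape)     # (200, 2)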

Recommendation
  Collaborative filtering: similarity measures over sparse vectors
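
The "similarity measure over sparse vectors" behind collaborative filtering can be sketched as cosine similarity on dict-based sparse rating vectors; the users and ratings below are made up.

    import math

    def cosine_sim(u, v):
        # u and v are sparse vectors stored as {item: rating} dicts;
        # only the items both users rated contribute to the dot product.
        common = set(u) & set(v)
        dot = sum(u[i] * v[i] for i in common)
        norm_u = math.sqrt(sum(r * r for r in u.values()))
        norm_v = math.sqrt(sum(r * r for r in v.values()))
        return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

    alice = {"item1": 5, "item2": 3, "item4": 1}
    bob   = {"item1": 4, "item2": 3, "item3": 2}
    carol = {"item3": 5, "item4": 4}
    print(cosine_sim(alice, bob))    # high: similar tastes
    print(cosine_sim(alice, carol))  # low: little overlap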

The second part breaks methods down into finer subdivisions and lists where each is applied.

Parameter estimation
  Maximum likelihood estimation:
    Linear regression: assuming the errors follow a normal distribution with zero mean, maximum likelihood reduces to least squares
    Logistic regression: the extremum of the likelihood function is found by gradient-descent iteration
    Gaussian mixture model
  Non-parametric estimation:
    Radial basis function network
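
A minimal sketch of the logistic-regression entry above: maximum likelihood through the sigmoid function, with the likelihood maximized by gradient iteration (NumPy assumed; the learning rate, iteration count, and toy data are illustrative).

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy 1-D data: label is 1 when x is large.
    X = np.column_stack([np.ones(6), [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]])
    y = np.array([0, 0, 0, 1, 1, 1])

    w = np.zeros(2)
    lr = 0.1
    for _ in range(2000):
        p = sigmoid(X @ w)
        # Gradient of the log-likelihood; ascending it is maximum likelihood,
        # equivalently descending the cross-entropy loss.
        grad = X.T @ (y - p)
        w += lr * grad
    print("weights:", w)
    print("P(y=1 | x=2.5):", sigmoid(np.array([1.0, 2.5]) @ w))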

Independence tests (non-parametric hypothesis tests)
  χ² test: feature-word selection; termination condition of classification and regression trees
  Rank-sum test
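
A minimal sketch of the χ² statistic on a 2×2 contingency table, as used for feature-word selection (the counts are made-up numbers).

    # Observed counts: rows = term present / absent, columns = class / not class.
    observed = [[30, 10],
                [20, 40]]

    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)

    # chi^2 = sum over cells of (observed - expected)^2 / expected,
    # where "expected" assumes the term and the class are independent.
    chi2 = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (obs - expected) ** 2 / expected

    print("chi-square statistic:", chi2)
    # Compare against the chi-square distribution with 1 degree of freedom
    # (critical value about 3.84 at the 5% level) to decide independence.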

Correlation tests
  Pearson correlation coefficient (assumes X and Y are drawn from normal distributions): text categorization based on the vector space model; user-preference recommendation systems
  Spearman rank correlation coefficient (non-parametric hypothesis test)
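
A minimal plain-Python sketch of both correlation coefficients; the sample data are illustrative. Spearman's coefficient is just Pearson's coefficient applied to the ranks, which is what makes it non-parametric.

    import math

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def ranks(values):
        # 1-based rank positions; ties are ignored for simplicity.
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    def spearman(x, y):
        return pearson(ranks(x), ranks(y))

    x = [1, 2, 3, 4, 5]
    y = [2, 1, 4, 3, 5]
    print(pearson(x, y))    # linear correlation
    print(spearman(x, y))   # rank correlation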

Optimization methods
  Unconstrained optimization:
    Gradient descent: maximum likelihood estimation (regression analysis, GMM), support vector machines, linear discriminant analysis
    Newton's method and its variants
  Constrained optimization: converted into an unconstrained problem via the Lagrange multiplier method
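
As a sketch of "Newton's method and its variants" for unconstrained optimization, here is a one-dimensional Newton iteration that finds a local minimum of a smooth function by driving its derivative to zero (the function, starting point, and stopping rule are illustrative).

    # Minimize f(x) = x^4 - 3x^3 + 2 by finding a root of f'(x) with Newton's method.
    def f_prime(x):
        return 4 * x ** 3 - 9 * x ** 2

    def f_double_prime(x):
        return 12 * x ** 2 - 18 * x

    x = 3.0                      # starting point
    for _ in range(20):
        step = f_prime(x) / f_double_prime(x)
        x -= step                # Newton update: x_{k+1} = x_k - f'(x_k)/f''(x_k)
        if abs(step) < 1e-10:
            break
    print("minimizer:", x)       # converges to 9/4 = 2.25, where f'(x) = 0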

Finding eigenvalues/eigenvectors
  Power method: linear discriminant analysis, dimensionality reduction
  Singular value decomposition (for symmetric matrices only): principal component analysis, spectral clustering
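
A minimal sketch of the power method for a dominant eigenvalue/eigenvector pair (NumPy assumed; the symmetric test matrix and iteration count are illustrative).

    import numpy as np

    def power_method(A, iterations=200):
        # Repeatedly multiply by A and renormalize; the vector converges to
        # the eigenvector of the eigenvalue with the largest magnitude.
        v = np.random.default_rng(0).normal(size=A.shape[0])
        for _ in range(iterations):
            v = A @ v
            v /= np.linalg.norm(v)
        eigenvalue = v @ A @ v        # Rayleigh quotient (v has unit norm)
        return eigenvalue, v

    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    lam, vec = power_method(A)
    print("dominant eigenvalue:", lam)
    print("check:", np.allclose(A @ vec, lam * vec, atol=1e-6))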

Information theory
  Information gain: feature-word selection, decision trees
  Mutual information: feature-word selection
  Cross entropy: feature-word selection, modeling and simulation of rare events, multimodal optimization problems
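
A minimal plain-Python sketch of entropy and information gain, the split criterion listed for decision trees and feature-word selection (the toy labels and split are made up).

    import math
    from collections import Counter

    def entropy(labels):
        counts = Counter(labels)
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def information_gain(labels, split_groups):
        # Gain = H(labels) - weighted average entropy after the split.
        total = len(labels)
        remainder = sum(len(g) / total * entropy(g) for g in split_groups)
        return entropy(labels) - remainder

    labels = ["yes", "yes", "yes", "no", "no", "no", "no", "yes"]
    # Splitting on some attribute partitions the labels into two groups.
    split = [["yes", "yes", "yes", "yes"], ["no", "no", "no", "no"]]
    print(information_gain(labels, split))   # 1.0 bit: a perfect split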

Kernel functions
  Polynomial kernel, Gaussian kernel (radial basis function), bipolar kernel: SVM, RBF network
  Unipolar sigmoid function: logistic regression, BP neural network
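
Minimal sketches of the kernel functions named above (NumPy assumed; the degree, gamma, and slope parameters are typical illustrative choices, not values from this summary).

    import numpy as np

    def polynomial_kernel(x, y, degree=3, c=1.0):
        # K(x, y) = (x . y + c)^degree
        return (np.dot(x, y) + c) ** degree

    def gaussian_kernel(x, y, gamma=0.5):
        # Radial basis function: K(x, y) = exp(-gamma * ||x - y||^2)
        return np.exp(-gamma * np.sum((x - y) ** 2))

    def sigmoid_kernel(x, y, alpha=0.1, c=0.0):
        # tanh is the bipolar sigmoid; the unipolar (logistic) sigmoid is
        # the activation used in logistic regression and BP networks.
        return np.tanh(alpha * np.dot(x, y) + c)

    x = np.array([1.0, 2.0])
    y = np.array([2.0, 0.5])
    print(polynomial_kernel(x, y), gaussian_kernel(x, y), sigmoid_kernel(x, y))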

Covariance
  Pearson correlation coefficient, PCA

EM algorithm
  Gaussian mixture model, forward-backward algorithm
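
A minimal sketch of the EM algorithm fitting a two-component, one-dimensional Gaussian mixture (NumPy assumed; the synthetic data, initial guesses, and fixed iteration count are illustrative assumptions).

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic data drawn from two Gaussians.
    data = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.5, 200)])

    def normal_pdf(x, mean, std):
        return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

    # Initial guesses for weights, means, and standard deviations.
    w = np.array([0.5, 0.5])
    mu = np.array([-1.0, 1.0])
    sigma = np.array([1.0, 1.0])

    for _ in range(100):
        # E-step: responsibility of each component for each point.
        dens = np.stack([w[k] * normal_pdf(data, mu[k], sigma[k]) for k in range(2)])
        resp = dens / dens.sum(axis=0)
        # M-step: re-estimate the parameters from the responsibilities.
        nk = resp.sum(axis=1)
        w = nk / len(data)
        mu = (resp * data).sum(axis=1) / nk
        sigma = np.sqrt((resp * (data - mu[:, None]) ** 2).sum(axis=1) / nk)

    print("weights:", w)   # roughly [0.6, 0.4]
    print("means:", mu)    # roughly [0.0, 5.0]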

Basis functions
  Gaussian mixture model, radial basis function network

Smoothing algorithms
  Laplace smoothing: Bayesian classification, hidden Markov model
  Good-Turing smoothing: hidden Markov model

Hidden Markov model
  Evaluation problem: forward algorithm
  Decoding problem: Viterbi algorithm
  Learning problem: Baum-Welch algorithm
  Applications: Chinese word segmentation, part-of-speech tagging
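
A minimal sketch of the forward algorithm for the HMM evaluation problem (plain Python; the two-state model and all of its probabilities are made-up illustrative numbers).

    # Hypothetical 2-state HMM: states 0 and 1, observations "a" and "b".
    start = [0.6, 0.4]                      # initial state probabilities
    trans = [[0.7, 0.3],                    # trans[i][j] = P(next=j | current=i)
             [0.4, 0.6]]
    emit = [{"a": 0.9, "b": 0.1},           # emit[i][o] = P(observation=o | state=i)
            {"a": 0.2, "b": 0.8}]

    def forward(observations):
        # alpha[i] = P(observations so far, current state = i)
        alpha = [start[i] * emit[i][observations[0]] for i in range(2)]
        for obs in observations[1:]:
            alpha = [
                sum(alpha[i] * trans[i][j] for i in range(2)) * emit[j][obs]
                for j in range(2)
            ]
        return sum(alpha)                   # total probability of the sequence

    print(forward(["a", "a", "b"]))         # likelihood of observing a, a, b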


Cover's theorem states that a complex pattern-classification problem cast nonlinearly into a high-dimensional space is more likely to be linearly separable than when it is projected into a low-dimensional space. This is why both the SVM and the RBF network map samples from a low-dimensional space into a high-dimensional space before classifying them.
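
As a toy illustration of this idea, the XOR pattern below is not linearly separable in the plane, but after a simple nonlinear map into three dimensions a single linear threshold separates it perfectly (the feature map and weights are hand-picked for this example).

    # XOR-style data: not linearly separable in 2-D.
    points = [(0, 0), (0, 1), (1, 0), (1, 1)]
    labels = [0, 1, 1, 0]

    def lift(x1, x2):
        # Nonlinear map into a higher-dimensional space, as in the kernel trick.
        return (x1, x2, x1 * x2)

    # In the lifted space, w . z + b with these hand-picked values separates XOR.
    w, b = (1.0, 1.0, -2.0), -0.5
    for (x1, x2), y in zip(points, labels):
        z = lift(x1, x2)
        score = sum(wi * zi for wi, zi in zip(w, z)) + b
        print((x1, x2), y, "predicted:", 1 if score > 0 else 0)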

Interestingly, the opposite family of methods reduces high-dimensional input samples to a low-dimensional space before classification or regression, e.g. PCA, the SOFM network, LDA, and spectral clustering. These methods assume that the samples have a clearer representation in a low-dimensional feature space, where regularities are easier to discover.

Reference: http://www.cnblogs.com/zhangchaoyang/archive/2012/08/28/2660929.html
