Summary of machine learning problems

Source: Internet
Author: User
Tags: svm

Each entry below gives the category, the method name, and its key concepts (the original table's Category / Name / Keywords columns).

Supervised classification
- Decision tree: information gain
- Classification and regression tree (CART): Gini index, χ² statistic, pruning
- Naive Bayes: non-parametric estimation, Bayesian estimation
- Linear discriminant analysis: Fisher discriminant, eigenvector solution
- K-nearest neighbors: similarity measures such as Euclidean distance, city-block distance, edit distance, vector angle (cosine), Pearson correlation coefficient
- Logistic regression (binary classification): parameter estimation (maximum likelihood), sigmoid function
- Radial basis function (RBF) network: non-parametric estimation, regularization theory, sigmoid function
- Counter-propagation network: unsupervised competitive learning combined with supervised Widrow-Hoff learning
- Learning vector quantization (LVQ) network: each output-layer unit is connected to several competitive-layer units
- Error back-propagation (BP) network: sigmoid function, gradient descent
- Support vector machine (binary classification): quadratic programming, Lagrange multipliers, dual problem, optimization, sequential minimal optimization (SMO), kernel trick
- Single-layer perceptron: handles only linearly separable problems
- Two-hidden-layer perceptron: sufficient to solve any complex classification problem
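To make the information-gain keyword concrete, here is a minimal sketch in Python (the toy labels and the helper names are illustrative, not from the original article):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Entropy reduction achieved by splitting `labels` into `groups`."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups)
    return entropy(labels) - remainder

# A perfectly informative split of a balanced binary node recovers 1 bit.
parent = ["yes", "yes", "no", "no"]
gain = information_gain(parent, [["yes", "yes"], ["no", "no"]])
```

A decision tree evaluates this gain for every candidate split and greedily picks the largest.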

Unsupervised classification (clustering)
- K-means: centroids
- Chameleon: graph partitioning, relative interconnectivity, relative closeness
- BIRCH: B-tree, CF (clustering feature) triples
- DBSCAN: core points, high density
- EM algorithm (Gaussian mixture model): parameter estimation (maximum likelihood)
- Spectral clustering: graph partitioning, singular value computation; globally convergent
- Self-organizing feature map (SOFM) network: unsupervised competitive learning
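The centroid idea behind k-means fits in a few lines. This 1-D toy sketch (the data and the function name are hypothetical) alternates the assignment step and the centroid-update step:

```python
def kmeans(points, centroids, iters=10):
    """Minimal 1-D k-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    for _ in range(iters):
        clusters = {i: [] for i in range(len(centroids))}
        for p in points:
            nearest = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in sorted(clusters.items())]
    return centroids

# Two well-separated groups converge to their means.
centroids = kmeans([1.0, 1.2, 0.8, 9.0, 9.2, 8.8], [0.0, 10.0])
```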

Regression analysis
- General linear regression: parameter estimation, least squares; generally used for prediction rather than classification
- Logistic regression (binary classification): parameter estimation (maximum likelihood), sigmoid function

Association rule mining
- FP-tree: frequent 1-itemsets, FP-tree, conditional pattern bases, suffix patterns

Dimensionality reduction
- Principal component analysis: covariance matrix, singular value decomposition

Recommendation
- Collaborative filtering: similarity measures over sparse vectors
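Collaborative filtering's "similarity over sparse vectors" is typically cosine similarity over rating dictionaries. A minimal sketch, with hypothetical users and items:

```python
from math import sqrt

def cosine_similarity(u, v):
    """Cosine of the angle between two sparse vectors given as {item: rating} dicts."""
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    norm_u = sqrt(sum(x * x for x in u.values()))
    norm_v = sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Hypothetical users: identical tastes -> similarity 1; disjoint items -> 0.
alice = {"item1": 5, "item2": 3}
bob = {"item1": 5, "item2": 3}
carol = {"item3": 4}
sim_ab = cosine_similarity(alice, bob)
sim_ac = cosine_similarity(alice, carol)
```

Only the keys two users share contribute to the dot product, which is what makes this practical on sparse rating data.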

The second table maps each method family to its subdivisions and the places where they are applied (the original table's Method subdivision / Application site columns).

Parameter estimation
- Maximum likelihood estimation:
  - Linear regression: assume the errors follow a normal distribution with zero mean; maximizing the likelihood then reduces to the least-squares method.
  - Logistic regression: gradient-descent iteration to maximize the likelihood function.
  - Gaussian mixture model.
- Non-parametric estimation:
  - Radial basis function network.
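As a concrete instance of the linear-regression point above: under zero-mean Gaussian noise, maximum likelihood reduces to least squares, and the one-feature closed form is short (the toy data and the helper name `fit_line` are illustrative):

```python
def fit_line(xs, ys):
    """Closed-form least-squares fit of y = a*x + b for a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means.
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])  # data lies exactly on y = 2x + 1
```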

Independence tests
- Non-parametric hypothesis tests:
  - Chi-square test: feature-word selection; stopping criterion for classification and regression trees.
  - Rank-sum test.

Correlation tests
- Pearson correlation coefficient (assumes X and Y are drawn pairwise from a normal distribution): text classification based on the vector space model; user-preference recommendation systems.
- Rank correlation coefficient (a non-parametric hypothesis test).
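The Pearson correlation coefficient itself is easy to compute directly (the toy data below is illustrative only):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation: covariance normalized by both standard deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson([1, 2, 3, 4], [2, 4, 6, 8])  # a perfect linear relationship
```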

Optimization methods
- Unconstrained optimization:
  - Gradient descent: maximum likelihood estimation (regression analysis, GMM), SVM, linear discriminant analysis.
  - Newton's method and its variants.
- When constraints are present, the problem is converted to an unconstrained one using Lagrange multipliers.

Eigenvalues and eigenvectors
- Power method: linear discriminant analysis, dimensionality reduction.
- Singular value decomposition (which coincides with eigendecomposition for symmetric matrices): principal component analysis, spectral clustering.
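A sketch of the power method named above, assuming a small dense matrix given as nested lists (the function name and the test matrix are illustrative):

```python
from math import sqrt

def power_iteration(matrix, iters=100):
    """Dominant eigenvalue and eigenvector of a small square matrix
    via repeated multiplication and normalization."""
    n = len(matrix)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient v^T A v recovers the eigenvalue (v is unit-length).
    av = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
    eigenvalue = sum(av[i] * v[i] for i in range(n))
    return eigenvalue, v

lam, vec = power_iteration([[2.0, 0.0], [0.0, 1.0]])  # dominant eigenvalue is 2
```

Convergence is geometric in the ratio of the two largest eigenvalues, which is why it suits finding only the leading component.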

Information theory
- Information gain: feature-word selection; decision trees.
- Mutual information: feature-word selection.
- Cross entropy: feature-word selection; rare-event modeling and simulation; multimodal optimization.

Kernel functions
- Polynomial kernel function: SVM.
- Gaussian kernel function (radial basis function): SVM, RBF network.
- Bipolar and unipolar sigmoid functions: logistic regression, BP neural network.
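The two kernels named above can be sketched directly (the parameter defaults are illustrative choices, not from the article):

```python
from math import exp

def gaussian_kernel(x, y, sigma=1.0):
    """RBF kernel: k(x, y) = exp(-||x - y||^2 / (2 * sigma^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return exp(-sq_dist / (2 * sigma ** 2))

def polynomial_kernel(x, y, degree=2, c=1.0):
    """Polynomial kernel: k(x, y) = (x . y + c) ** degree."""
    return (sum(a * b for a, b in zip(x, y)) + c) ** degree

k_same = gaussian_kernel([1.0, 2.0], [1.0, 2.0])  # identical points give 1.0
k_poly = polynomial_kernel([1.0, 1.0], [1.0, 1.0])  # (1 + 1 + 1)^2 = 9.0
```

Each kernel value equals an inner product in some higher-dimensional feature space, which is the "kernel trick" listed under SVM above.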

Covariance
- Pearson correlation coefficient; principal component analysis.

EM algorithm
- Gaussian mixture model; forward-backward algorithm (HMM training).

Basis functions
- Gaussian mixture model; radial basis function network.

Smoothing algorithms
- Laplace smoothing: Bayesian classification, hidden Markov models.
- Good-Turing smoothing.

Hidden Markov models
- Evaluation problem: forward algorithm.
- Decoding problem: Viterbi algorithm (Chinese word segmentation, part-of-speech tagging).
- Learning problem: Baum-Welch algorithm.
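A compact sketch of the Viterbi decoding step, using a hypothetical two-state weather HMM (all probabilities below are made up for illustration):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an observation sequence (toy HMM)."""
    path_prob = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    back = []  # back-pointers: best predecessor of each state at each step
    for o in obs[1:]:
        prev = path_prob
        back.append({})
        path_prob = {}
        for s in states:
            best_prev = max(states, key=lambda p: prev[p] * trans_p[p][s])
            back[-1][s] = best_prev
            path_prob[s] = prev[best_prev] * trans_p[best_prev][s] * emit_p[s][o]
    # Trace the back-pointers from the most probable final state.
    last = max(states, key=lambda s: path_prob[s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

states = ["rain", "sun"]
path = viterbi(
    ["umbrella", "umbrella", "no_umbrella"],
    states,
    {"rain": 0.5, "sun": 0.5},
    {"rain": {"rain": 0.7, "sun": 0.3}, "sun": {"rain": 0.3, "sun": 0.7}},
    {"rain": {"umbrella": 0.9, "no_umbrella": 0.1},
     "sun": {"umbrella": 0.2, "no_umbrella": 0.8}},
)
```

For word segmentation and POS tagging, the hidden states are tags and the observations are characters or words; the recursion is identical.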

Cover's theorem states that a complex pattern-classification problem, when mapped nonlinearly into a high-dimensional space, is more likely to be linearly separable than when projected into a low-dimensional space. This is why both SVMs and RBF networks try to map samples from a low-dimensional space into a higher-dimensional one before classifying them.
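A tiny illustration of Cover's theorem: XOR is not linearly separable in two dimensions, but after adding the nonlinear feature x1*x2 a single hyperplane classifies it perfectly (the weights below were chosen by hand and are illustrative):

```python
# XOR: the four corner points of the unit square with their labels.
points = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def lift(x):
    """Nonlinear map (x1, x2) -> (x1, x2, x1*x2) into 3-D."""
    return (x[0], x[1], x[0] * x[1])

# In the lifted space the plane with w = (1, 1, -2), b = -0.5 separates XOR.
w, b = (1, 1, -2), -0.5
predictions = [int(sum(wi * xi for wi, xi in zip(w, lift(x))) + b > 0)
               for x, _ in points]
```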

Interestingly, the opposite strategy also works: reduce the input samples from a high-dimensional space to a low-dimensional one before classification or regression analysis, as PCA, SOFM networks, LDA, and spectral clustering do. These methods assume that the samples have a clearer representation in a low-dimensional feature space, which makes patterns easier to discover.
