Huadian North Wind Blows
Key Laboratory of Cognitive Computing and Application, Tianjin University
Date: 2015/12/11
Gaussian discriminant analysis (GDA) is a generative model: what the model ultimately learns is the joint probability of the features and the class.
0. The multivariate normal distribution
To determine a multivariate normal distribution, one only needs to know its mean vector $\mu\in\mathbb{R}^{n\times 1}$ and covariance matrix $\Sigma\in\mathbb{R}^{n\times n}$.
The probability density function is as follows:
$$p(x;\mu,\Sigma)=\frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}}\exp\left(-\frac{1}{2}(x-\mu)^T\Sigma^{-1}(x-\mu)\right)\tag{0}$$
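Formula (0) translates directly into code. Below is a minimal NumPy sketch (not from the original post) that evaluates the density term by term; the function name `mvn_pdf` is my own:

```python
import numpy as np

def mvn_pdf(x, mu, sigma):
    """Density of formula (0): the N(mu, sigma) multivariate normal.

    x, mu: 1-D arrays of length n; sigma: (n, n) covariance matrix.
    """
    n = mu.shape[0]
    diff = x - mu
    # Normalization constant: 1 / ((2*pi)^(n/2) * |Sigma|^(1/2))
    norm_const = 1.0 / ((2 * np.pi) ** (n / 2) * np.linalg.det(sigma) ** 0.5)
    # Quadratic form in the exponent: -(1/2) (x-mu)^T Sigma^{-1} (x-mu)
    exponent = -0.5 * diff @ np.linalg.inv(sigma) @ diff
    return norm_const * np.exp(exponent)
```

At the mean of a 2-D standard normal ($\mu=0$, $\Sigma=I$) this returns $1/(2\pi)\approx 0.159$, matching the formula with the quadratic form equal to zero.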
1. Gaussian discriminant analysis
Scope of application: the input features are continuous.
Model statement:
$$y\sim \mathrm{Bernoulli}(\phi)\tag{1-1}$$
$$x|y=0\sim N(\mu_0,\Sigma)\tag{1-2}$$
$$x|y=1\sim N(\mu_1,\Sigma)\tag{1-3}$$
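Because GDA is generative, the model statement above is literally a recipe for producing data: first draw the class label from the Bernoulli, then draw the features from the class-conditional Gaussian. A minimal sampling sketch (the parameter values here are illustrative assumptions, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative parameters (not from the post)
phi = 0.4                    # P(y = 1)
mu0 = np.array([0.0, 0.0])   # class-0 mean
mu1 = np.array([3.0, 3.0])   # class-1 mean
sigma = np.eye(2)            # shared covariance matrix

def sample(m):
    """Draw m (x, y) pairs from the generative process (1-1)-(1-3)."""
    y = rng.binomial(1, phi, size=m)            # step 1: y ~ Bernoulli(phi)
    means = np.where(y[:, None] == 1, mu1, mu0)  # pick mu_0 or mu_1 per sample
    x = means + rng.multivariate_normal(np.zeros(2), sigma, size=m)
    return x, y
```

Note that both classes share the same covariance $\Sigma$; only the means differ, which is what makes the resulting decision boundary linear.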
Using formula (0), formula (1-1) can be written as:
$$p(y)=\phi^y(1-\phi)^{1-y}$$
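The parameters $\phi$, $\mu_0$, $\mu_1$, $\Sigma$ are fitted by maximizing the joint likelihood, and the maximum-likelihood estimates have the standard closed forms: $\phi$ is the fraction of positive examples, $\mu_0$ and $\mu_1$ are the per-class sample means, and $\Sigma$ is the covariance pooled over both classes. A minimal sketch of those estimates (the function name `gda_fit` is my own, not from the post):

```python
import numpy as np

def gda_fit(X, y):
    """Closed-form MLE for the GDA parameters of model (1-1)-(1-3).

    X: (m, n) feature matrix; y: (m,) array of labels in {0, 1}.
    """
    m = len(y)
    phi = y.mean()                 # fraction of examples with y = 1
    mu0 = X[y == 0].mean(axis=0)   # class-0 sample mean
    mu1 = X[y == 1].mean(axis=0)   # class-1 sample mean
    # Shared covariance: deviations from each example's own class mean
    means = np.where(y[:, None] == 1, mu1, mu0)
    diff = X - means
    sigma = diff.T @ diff / m
    return phi, mu0, mu1, sigma
```

On a tiny hand-checkable dataset with two examples per class, the estimates reduce to the obvious class proportions and class means.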