MC, MCMC, Gibbs Sampling principle & implementation (in R)

Source: Internet
Author: User

This article covers random sampling methods for a specified distribution: MC (Monte Carlo), Markov chains, and MCMC (Markov Chain Monte Carlo), with several examples implemented in R:

1. Markov Chain

2. Random Walk

3. MCMC methods:

3.1 The M-H (Metropolis-Hastings) method

3.2 Gibbs Sampling

PS: This blog is a reference for an ESE machine learning short course (2014-05-16 session); details were covered in class.

The following sections briefly describe the basic concept behind each point and attach the corresponding code. I will summarize each concept in the plainest possible words; for details, see the recommended links at the bottom.

0. MC (Monte Carlo)

Monte Carlo methods generate random samples from a specified distribution; the samples can then be used to approximate quantities such as expectations and integrals.
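As a minimal sketch of this idea (in Python for illustration; the function name mc_pi and the sample count are my own choices, not from the article), uniform samples in the unit square can estimate pi from the fraction of points that falls inside the quarter circle:

```python
import random

def mc_pi(n_samples, seed=42):
    """Estimate pi by drawing uniform points in the unit square and
    counting the fraction that lands inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # the quarter-circle area is pi/4, so scale the observed fraction by 4
    return 4.0 * inside / n_samples

print(mc_pi(100_000))
```

The estimate tightens as n_samples grows, at the usual Monte Carlo rate of O(1/sqrt(n)).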

1. Markov Chain

Let f(t) be a time series. A Markov chain is a stochastic process in which f(t+1) depends only on f(t), not on the earlier history.

Implementation in R:

  #author: Rachel @ zju
  #email: [Email protected]
  N = 10000
  signal = vector(length = N)
  signal[1] = 0
  for (i in 2:N)
  {
    # randomly pick an offset from {-1, 1} and add it to signal[i-1]
    signal[i] = signal[i-1] + sample(c(-1, 1), 1)
  }
  plot(signal, type = 'l', col = 'red')





2. Random Walk

For example, Brownian motion: this is just a two-dimensional extension of the Markov chain above:

Implementation in R:

  #author: Rachel @ zju
  #email: [Email protected]
  N = 10000   # number of steps (the value was lost in this copy; 10000 matches the first example)
  x = vector(length = N)
  y = vector(length = N)
  x[1] = 0
  y[1] = 0
  for (i in 2:N)
  {
    x[i] = x[i-1] + rnorm(1)
    y[i] = y[i-1] + rnorm(1)
  }
  plot(x, y, type = 'l', col = 'red')



3. MCMC methods:

The MCMC approach was first given by Metropolis et al. (1953); Hastings later generalized the Metropolis algorithm, and the two are collectively known as the M-H (Metropolis-Hastings) algorithm. The M-H algorithm is the basic method of MCMC: many newer sampling methods derive from it, including Gibbs sampling, the most commonly used MCMC method, which can be seen as a special case of the M-H algorithm [2].

To sum up, MCMC rests on the following theory: if the transition kernel satisfies the detailed balance equation, then after a sufficiently long burn-in the chain converges to its stationary distribution, which is exactly the target distribution we want to sample from.

Detailed balance equation: pi(x) * p(y|x) = pi(y) * p(x|y), where pi denotes the target distribution and p the transition (conditional) probability. It states that the probability flow from x to y (the distribution at x times the transition probability to y) balances the flow from y to x.
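To make the balance concrete, here is a small numeric check (a hypothetical 3-state chain in Python; the values of pi and P are invented for this sketch): when every pair of states satisfies pi(x) * P(x,y) = pi(y) * P(y,x), pi is automatically a stationary distribution of P.

```python
import itertools

# Hypothetical target distribution pi and transition matrix P
# (rows of P sum to 1); these numbers are invented for illustration.
pi = [0.2, 0.3, 0.5]
P = [
    [0.70, 0.15, 0.15],
    [0.10, 0.80, 0.10],
    [0.06, 0.06, 0.88],
]

# Check detailed balance: pi[x] * P[x][y] == pi[y] * P[y][x] for every pair.
for x, y in itertools.combinations(range(3), 2):
    assert abs(pi[x] * P[x][y] - pi[y] * P[y][x]) < 1e-12

# Detailed balance implies stationarity: (pi P)[y] == pi[y] for every state y.
pi_next = [sum(pi[x] * P[x][y] for x in range(3)) for y in range(3)]
print(pi_next)
```

Summing the balance identity over x gives sum_x pi(x) P(x,y) = pi(y) sum_x P(y,x) = pi(y), which is why stationarity follows for free.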

3.1 The M-H method

1. Choose the target distribution pi and initialize x_0.

2. At step n, generate a candidate state y from the proposal distribution q(y|x_n).

3. Accept y with probability min(1, (pi(y) * q(x_n|y)) / (pi(x_n) * q(y|x_n))); otherwise set x_{n+1} = x_n. (Compare this ratio with the detailed balance equation above to see what the probability means; see also [1].)

Implementation in R:

  #author: Rachel @ zju
  #email: [Email protected]
  N = 10000
  x = vector(length = N)
  x[1] = 0
  # uniform random numbers u, used in the accept/reject step
  u = runif(N)
  m_sd = 5
  freedom = 5
  for (i in 2:N)
  {
    # propose a candidate centred at the current state
    y = rnorm(1, mean = x[i-1], sd = m_sd)
    print(y)
    #y = rt(1, df = freedom)
    # acceptance ratio for the proposed move
    p_accept = dnorm(x[i-1], mean = y, sd = abs(2*y+1)) / dnorm(y, mean = x[i-1], sd = abs(2*x[i-1]+1))
    #print(p_accept)
    if (u[i] <= p_accept)
    {
      x[i] = y
      print("accept")
    } else
    {
      x[i] = x[i-1]
      print("reject")
    }
  }
  plot(x, type = 'l')
  dev.new()
  hist(x)
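For comparison, here is a minimal Metropolis sketch in Python with an explicit target density (a standard normal, chosen for this illustration, with function names of my own) and a symmetric random-walk proposal; with a symmetric proposal q(x_n|y) = q(y|x_n), so the acceptance ratio reduces to pi(y)/pi(x_n):

```python
import math
import random

def metropolis_normal(n_steps, proposal_sd=1.0, seed=1):
    """Sample from a standard normal target using a symmetric
    random-walk proposal and the Metropolis accept/reject rule."""
    rng = random.Random(seed)
    log_pi = lambda z: -0.5 * z * z  # log of the unnormalised target density
    x = 0.0
    samples = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, proposal_sd)
        # accept with probability min(1, pi(y)/pi(x)), done on the log scale
        if math.log(rng.random()) <= log_pi(y) - log_pi(x):
            x = y
        samples.append(x)
    return samples

chain = metropolis_normal(50_000)
mean = sum(chain) / len(chain)
var = sum((s - mean) ** 2 for s in chain) / len(chain)
print(mean, var)
```

Working on the log scale avoids numerical underflow; the chain's sample mean and variance should approach the target's 0 and 1 as the chain grows.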




3.2 Gibbs Sampling

At the nth step, draw \theta_1 from p(\theta_1 | \theta_2, ..., \theta_k), then \theta_2 from p(\theta_2 | \theta_1, \theta_3, ..., \theta_k), and so on through \theta_k; iterating this sampling converges to the true joint distribution p(\theta_1, \theta_2, ...). In other words, each parameter is sampled in turn while all the others are held fixed. For example, for a bivariate normal distribution with means m1, m2, standard deviations s1, s2 and correlation rho, the one-dimensional conditional distribution of each component is still normal:

x | y ~ N(m1 + rho * (s1/s2) * (y - m2), (1 - rho^2) * s1^2)

y | x ~ N(m2 + rho * (s2/s1) * (x - m1), (1 - rho^2) * s2^2)

Then the process of iterative sampling in Gibbs sampling is implemented as follows:

  #author: Rachel @ zju
  #email: [Email protected]
  #define the Gaussian conditional distributions
  p_ygivenx <- function(x, m1, m2, s1, s2)
  {
    return(rnorm(1, m2 + rho * s2/s1 * (x - m1), sqrt(1 - rho^2) * s2))
  }
  p_xgiveny <- function(y, m1, m2, s1, s2)
  {
    return(rnorm(1, m1 + rho * s1/s2 * (y - m2), sqrt(1 - rho^2) * s1))
  }
  N = 5000   # number of stored samples (the value was lost in this copy)
  K = 10     # iterations in each sampling (the value was lost in this copy)
  x_res = vector(length = N)
  y_res = vector(length = N)
  m1 = 10; m2 = -5; s1 = 5; s2 = 2
  rho = 0.5
  y = m2
  for (i in 1:N)
  {
    # inner loop variable renamed from i to j so it does not clobber the outer index
    for (j in 1:K)
    {
      x = p_xgiveny(y, m1, m2, s1, s2)
      y = p_ygivenx(x, m1, m2, s1, s2)
    }
    x_res[i] = x
    y_res[i] = y
  }
  hist(x_res, freq = 1)
  dev.new()
  plot(x_res, y_res)
  library(MASS)
  valid_range = seq(from = N/2, to = N, by = 1)
  mvn.kdensity <- kde2d(x_res[valid_range], y_res[valid_range], h = 10)   # estimate the kernel density
  plot(x_res[valid_range], y_res[valid_range], col = "blue", xlab = "x", ylab = "y")
  contour(mvn.kdensity, add = TRUE)   # contour plot of the bivariate normal
  #real distribution
  # real = mvrnorm(N, c(m1, m2), diag(c(s1, s2)))
  # dev.new()
  # plot(real[1:N, 1], real[1:N, 2])



The histogram of x and the (x, y) scatter/contour plot produced by the code above are not reproduced in this copy.

Reference:

1. http://www2.isye.gatech.edu/~brani/isyebayes/bank/handout10.pdf

2. http://site.douban.com/182577/widget/notes/10567181/note/292072927/

3. Book: http://statweb.stanford.edu/~owen/mc/

4. Classic: http://cis.temple.edu/~latecki/courses/robotfall07/papersfall07/andrieu03introduction.pdf

From: http://blog.csdn.net/abcjennifer/article/details/25908495
