Simple C++ implementation of the Metropolis algorithm and a MATLAB implementation


Metropolis is a sampling method that is generally used to obtain samples from relatively complex probability distributions.
1. The most basic form of sampling is generating random numbers, usually uniformly distributed ones; for example, the rand() function in C++ can be used directly.
2. The next level of complexity is sampling by transformation: common distributions such as the Gaussian are usually obtained by transforming uniform (or other already available) random variables. For example, a Gaussian sample can be generated as follows:
Let u, v be independent Uniform(0,1) random variables;
then gauss = sqrt(-2*ln(u)) * cos(2*pi*v).
(See the Box-Muller transform for details; a short C++ sketch follows below.)
In MATLAB such Gaussian samples can be generated directly with randn.
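To make points 1 and 2 concrete, here is a minimal C++ sketch (my own addition, not from the original post) that turns two uniform rand() draws into one standard Gaussian sample using the Box-Muller formula above; the function name box_muller is just illustrative.

#include <cstdlib>   // rand, srand, RAND_MAX
#include <cmath>     // sqrt, log, cos
#include <iostream>

const double PI = 3.14159265358979323846;

// Box-Muller transform: turn two independent Uniform(0,1) draws u, v
// into one standard-normal sample: sqrt(-2*ln(u)) * cos(2*pi*v).
double box_muller() {
    // shift by 1 so u, v lie in (0, 1] and log(u) never sees 0
    double u = (rand() + 1.0) / (RAND_MAX + 1.0);
    double v = (rand() + 1.0) / (RAND_MAX + 1.0);
    return sqrt(-2.0 * log(u)) * cos(2.0 * PI * v);
}

int main() {
    srand(42);                          // fixed seed, just for reproducibility
    for (int i = 0; i < 5; i++) {
        std::cout << box_muller() << std::endl;
    }
    return 0;
}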


3. The last family of methods is MCMC. MCMC is really a framework; here we look at its simplest member, the Metropolis method. Its core idea is to build a Markov chain whose states become the samples: the chain's transition and acceptance probabilities steer how samples are generated. The method is simple and effective; for details see "LDA Math Gossip", which explains it thoroughly. This article gives a small demo written in C++. A MATLAB version of the same example also exists (not written by me) and is included here as well; its plotting functions make it easy to visually verify that the sampling works (although the MATLAB version seems to have a small bug).


The target distribution we want to sample from is proportional to pow(t, -3.5) * exp(-1/(2*t)) (CSDN, could you please add a LaTeX-style editor??). Here the Metropolis algorithm assumes a symmetric proposal, i.e. P(x|y) = P(y|x), so a candidate x' is accepted with probability min(1, p(x')/p(x)). The complete code is as follows:
#include <iostream>
#include <time.h>
#include <stdlib.h>
#include <cmath>
using namespace std;

#define DEBUG(a) std::cout << #a << ": " << a << std::endl

class MCMCSimulation {
public:
    int n;           // total number of samples
    int nn;          // burn-in length (10% of n)
    float* sample;   // the chain

    MCMCSimulation() {
        this->n = 1000;
        this->nn = 0.1 * n;
        this->sample = new float[n];
        this->sample[0] = 0.3;   // initial state of the chain
    }

    ~MCMCSimulation() {
        delete[] sample;
    }

    void run() {
        for (int i = 1; i < this->n; i++) {
            // draw a candidate from the Uniform(0,3) proposal
            float p_sample = this->pdf_proposal(3);
            // ratio of the target density at the candidate and the current state
            float alpha = pdf(p_sample) / pdf(this->sample[i - 1]);
            // accept the candidate with probability min(alpha, 1)
            if (pdf_proposal(1) < this->min(alpha, 1)) {
                this->sample[i] = p_sample;
            } else {
                this->sample[i] = this->sample[i - 1];
            }
            //DEBUG(p_sample);
            //DEBUG(pdf_proposal(1));
            //DEBUG(sample[i]);
            cout << sample[i] << endl;
        }
    }

private:
    // the (unnormalized) probability density function of the target
    float pdf(float t) {
        return pow(t, -3.5) * exp(-1 / (2 * t));
    }

    // generate a random number from 0 to max
    float pdf_proposal(int max) {
        //srand
        return (float)(max) * (float)(rand()) / (float)(RAND_MAX);
    }

    float min(float x, float y) {
        if (x < y) { return x; } else { return y; }
    }
};

int main() {
    MCMCSimulation* sim = new MCMCSimulation();
    sim->run();
    delete sim;
    return 0;
}
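As a rough sanity check (my own addition, not part of the original post), main() could be extended to summarize the chain after discarding the first nn states as burn-in; n, nn and sample are already public members of the class above, so the sketch below only reads them. Note that because the Uniform(0,3) proposal never proposes values above 3, the chain explores the target density truncated to (0,3), so this is only a crude check.

int main() {
    MCMCSimulation* sim = new MCMCSimulation();
    sim->run();

    // crude summary: empirical mean of the chain after burn-in
    float sum = 0.0f;
    for (int i = sim->nn; i < sim->n; i++) {
        sum += sim->sample[i];
    }
    cout << "mean after burn-in: " << sum / (sim->n - sim->nn) << endl;

    delete sim;
    return 0;
}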

MATLAB version

%% Metropolis algorithm example
%{
--------------------------------------------------------------------------
*Created by: Felipe Uribe-Castillo
*Date: July
*Comment: Final project algorithm
*Mail: [email protected]
*University: National University of Colombia at Manizales, Civil Engineering Dept.
--------------------------------------------------------------------------
The Metropolis algorithm: the first MCMC method, used to obtain samples from
some complex probability distribution and to integrate very complex functions
by random sampling. It considers only symmetric proposals (proposal PDF).
Original contribution:
- "The Monte Carlo method", Metropolis & Ulam (1949).
- "Equations of state calculations by fast computing machines",
  Metropolis, Rosenbluth, Rosenbluth, Teller & Teller (1953).
--------------------------------------------------------------------------
Based on:
1. "Markov chain Monte Carlo and Gibbs sampling", B. Walsh,
   lecture notes for EEB 581, version April 2004.
   http://web.mit.edu/~wingated/www/introductions/mcmc-gibbs-intro.pdf
%}
clear; close all; home; format long g;

%% Initial data
% Distributions
nu = 5;                                          % target PDF parameter
p  = @(t) (t.^(-nu/2-1)).*(exp(-1./(2*t)));      % target "PDF", Ref. 1 - ex. 2
proposal_PDF = @(mu) unifrnd(0,3);               % proposal PDF
% Parameters
N  = 1000;                 % number of samples (iterations)
nn = 0.1*(N);              % number of samples used to examine the autocorrelation
theta = zeros(1,N);        % samples drawn from the target "PDF"
theta(1) = 0.3;            % initial state of the chain

%% Procedure
for i = 1:N
    theta_ast = proposal_PDF([]);       % sample from the proposal PDF
    alpha = p(theta_ast)/p(theta(i));   % ratio of the density at theta_ast and theta
    if rand <= min(alpha,1)             % accept the sample with prob = min(alpha,1)
        theta(i+1) = theta_ast;
    else                                % reject the sample with prob = 1-min(alpha,1)
        theta(i+1) = theta(i);
    end
end

% Autocorrelation (AC)
pp  = theta(1:nn);          % first nn samples
pp2 = theta(end-nn:end);    % last nn samples
[r,  lags ] = xcorr(pp -mean(pp), 'coeff');
[r2, lags2] = xcorr(pp2-mean(pp2),'coeff');

%% Plots
% Autocorrelation
figure;
subplot(2,1,1);
stem(lags,r);
title('Autocorrelation','FontSize',14);
ylabel('AC (first samples)','FontSize',12);
subplot(2,1,2);
stem(lags2,r2);
ylabel('AC (last samples)','FontSize',12);

% Samples and distributions
xx = eps:0.01:10;           % x-axis (graphs)
figure;
% Histogram and target distribution
subplot(2,1,1);
[n1, x1] = hist(theta, ceil(sqrt(N)));
bar(x1, n1/(N*(x1(2)-x1(1))));                          % normalized histogram
colormap summer;
hold on;
plot(xx, p(xx)/trapz(xx,p(xx)), 'r', 'LineWidth', 3);   % normalized target function
grid on;
xlim([0 max(theta)]);
title('Distribution of samples','FontSize',14);
ylabel('Probability density function','FontSize',12);
% Samples
subplot(2,1,2);
plot(theta, 1:N+1, 'b-');
xlim([0 max(theta)]);
ylim([0 N]);
grid on;
xlabel('Location','FontSize',12);
ylabel('Iterations, N','FontSize',12);
%%End
Below is the resulting sample distribution plot.

