Gaussian mixture model of image processing


One: Overview

The Gaussian mixture model (GMM) has been used in image segmentation, object recognition, video analysis, and so on. For any given set of data samples, we can compute a probability distribution over each sample vector and classify the samples according to it. These distributions, however, are mixed together; to isolate the distribution of each individual class, the samples are clustered, and the distribution of each cluster is described by a Gaussian function. That is the Gaussian mixture model, or GMM.
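As a minimal sketch of the mixture density itself, p(x) = Σᵢ wᵢ · N(x | μᵢ, σᵢ²), here is a small self-contained example. The weights, means, and variances below are made up purely for illustration and are not part of the original code:

```java
public class MixtureDensityDemo {
    // Univariate Gaussian probability density function
    static double gaussian(double x, double mean, double var) {
        return Math.exp(-0.5 * (x - mean) * (x - mean) / var)
                / Math.sqrt(2 * Math.PI * var);
    }

    // Mixture density: weighted sum of the component densities
    static double mixture(double x, double[] w, double[] mean, double[] var) {
        double p = 0;
        for (int i = 0; i < w.length; i++) {
            p += w[i] * gaussian(x, mean[i], var[i]);
        }
        return p;
    }

    public static void main(String[] args) {
        double[] w = {0.4, 0.6};       // hypothetical weights (must sum to 1)
        double[] mean = {10.0, 80.0};  // hypothetical component means
        double[] var = {25.0, 100.0};  // hypothetical component variances
        System.out.println(mixture(12.0, w, mean, var));
    }
}
```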



Two: E-M algorithm

The mixture parameters are estimated with the E-M (expectation-maximization) algorithm: the E-step computes, for each sample, its expected membership (posterior probability) in each Gaussian component; the M-step then re-estimates the weights, means, and variances from those expectations. The variant used here is also known as D-EM, distance-based expectation maximization.
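For a single sample x, the E-step responsibility of component j is γⱼ = wⱼ·N(x|μⱼ,σⱼ²) / Σₖ wₖ·N(x|μₖ,σₖ²), which is the per-component `pj` value computed in the code of section Four. A minimal illustration, with made-up parameters:

```java
public class EStepDemo {
    // Univariate Gaussian probability density function
    static double gaussian(double x, double mean, double var) {
        return Math.exp(-0.5 * (x - mean) * (x - mean) / var)
                / Math.sqrt(2 * Math.PI * var);
    }

    // Responsibilities: posterior probability of each component given x
    static double[] responsibilities(double x, double[] w, double[] mean, double[] var) {
        double[] r = new double[w.length];
        double total = 0;
        for (int j = 0; j < w.length; j++) {
            r[j] = w[j] * gaussian(x, mean[j], var[j]);
            total += r[j];
        }
        for (int j = 0; j < w.length; j++) {
            r[j] /= total; // normalize so the responsibilities sum to 1
        }
        return r;
    }

    public static void main(String[] args) {
        // hypothetical two-component model; x = 12 lies near the first component
        double[] r = responsibilities(12.0,
                new double[]{0.5, 0.5}, new double[]{10.0, 80.0}, new double[]{25.0, 100.0});
        System.out.println(r[0] + " " + r[1]);
    }
}
```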


Three: Algorithm steps

1. Initialize the variables: specify the number of clusters K and the data dimension D

2. Initialize the means, covariances, and prior probability distributions

3. Iterate the E-M steps

- E-step: compute the expectations (posterior probabilities)

- M-step: update the means, covariances, and prior probability distributions

- Check whether a stop condition is reached (maximum number of iterations, or the minimum error is met); if so, exit the iteration, otherwise continue the E-M steps

4. Print the final classification results


Four: Code implementation

package com.gloomyfish.image.gmm;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/**
 * @author gloomy fish
 */
public class GMMProcessor {
	public final static double MIN_VAR = 1E-10;
	public static double[] samples = new double[]{10, 9, 4, 23, 13, 16, 5, 90, 100, 80, 55, 67, 8, 93, 47, 86, 3};
	private int dimNum;
	private int mixNum;
	private double[] weights;
	private double[][] m_means;
	private double[][] m_vars;
	private double[] m_minVars;

	/**
	 * @param m_dimNum - dimension of each sample, e.g. 3 for the RGB vector of an image pixel
	 * @param m_mixNum - number of clusters, i.e. the number of Gaussian components in the mixture
	 */
	public GMMProcessor(int m_dimNum, int m_mixNum) {
		dimNum = m_dimNum;
		mixNum = m_mixNum;
		weights = new double[mixNum];
		m_means = new double[mixNum][dimNum];
		m_vars = new double[mixNum][dimNum];
		m_minVars = new double[dimNum];
	}

	/**
	 * @param data - the data to be processed
	 */
	public void process(double[] data) {
		int m_maxIterNum = 100;
		double err = 0.001;
		boolean loop = true;
		int iterNum = 0;
		double lastL = 0;
		double currL = 0;
		int unchanged = 0;

		initParameters(data);
		int size = data.length / dimNum;
		double[] x = new double[dimNum];
		double[][] next_means = new double[mixNum][dimNum];
		double[] next_weights = new double[mixNum];
		double[][] next_vars = new double[mixNum][dimNum];
		List<DataNode> cList = new ArrayList<DataNode>();

		while (loop) {
			Arrays.fill(next_weights, 0);
			cList.clear();
			for (int i = 0; i < mixNum; i++) {
				Arrays.fill(next_means[i], 0);
				Arrays.fill(next_vars[i], 0);
			}
			lastL = currL;
			currL = 0;
			for (int k = 0; k < size; k++) {
				for (int j = 0; j < dimNum; j++)
					x[j] = data[k * dimNum + j];
				double p = getProbability(x); // total probability density of this point
				DataNode dn = new DataNode(x);
				dn.index = k;
				cList.add(dn);
				double maxP = 0;
				for (int j = 0; j < mixNum; j++) {
					// posterior probability (responsibility) of component j for this point
					double pj = getProbability(x, j) * weights[j] / p;
					if (maxP < pj) {
						maxP = pj;
						dn.cindex = j;
					}
					next_weights[j] += pj;
					for (int d = 0; d < dimNum; d++) {
						next_means[j][d] += pj * x[d];
						next_vars[j][d] += pj * x[d] * x[d];
					}
				}
				currL += (p > 1E-20) ? Math.log10(p) : -20;
			}
			currL /= size;

			// Re-estimation: generate new weights, means and variances.
			for (int j = 0; j < mixNum; j++) {
				weights[j] = next_weights[j] / size;
				if (weights[j] > 0) {
					for (int d = 0; d < dimNum; d++) {
						m_means[j][d] = next_means[j][d] / next_weights[j];
						m_vars[j][d] = next_vars[j][d] / next_weights[j] - m_means[j][d] * m_means[j][d];
						if (m_vars[j][d] < m_minVars[d]) {
							m_vars[j][d] = m_minVars[d];
						}
					}
				}
			}

			// Termination conditions
			iterNum++;
			if (Math.abs(currL - lastL) < err * Math.abs(lastL)) {
				unchanged++;
			}
			if (iterNum >= m_maxIterNum || unchanged >= 3) {
				loop = false;
			}
		}

		// Print the result
		System.out.println("================= final result =================");
		for (int i = 0; i < mixNum; i++) {
			for (int k = 0; k < dimNum; k++) {
				System.out.println("[" + i + "]:");
				System.out.println("means : " + m_means[i][k]);
				System.out.println("var : " + m_vars[i][k]);
				System.out.println();
			}
		}

		// Print the classification of each sample
		for (int i = 0; i < size; i++) {
			System.out.println("data[" + i + "]=" + data[i] + " cindex : " + cList.get(i).cindex);
		}
	}

	/**
	 * @param data
	 */
	private void initParameters(double[] data) {
		// initialize the means with randomly picked samples
		int size = data.length / dimNum;
		for (int i = 0; i < mixNum; i++) {
			for (int d = 0; d < dimNum; d++) {
				m_means[i][d] = data[(int) (Math.random() * size)];
			}
		}

		// classify each sample by its nearest mean
		int[] types = new int[size];
		for (int k = 0; k < size; k++) {
			double min = Double.MAX_VALUE;
			for (int i = 0; i < mixNum; i++) {
				double v = 0;
				for (int j = 0; j < dimNum; j++) {
					v += Math.abs(data[k * dimNum + j] - m_means[i][j]);
				}
				if (v < min) {
					min = v;
					types[k] = i;
				}
			}
		}
		double[] counts = new double[mixNum];
		for (int i = 0; i < types.length; i++) {
			counts[types[i]]++;
		}

		// calculate the prior probability (weight) of each component
		for (int i = 0; i < mixNum; i++) {
			weights[i] = counts[i] / size;
		}

		// calculate the variance of each category
		double[] overMeans = new double[dimNum];
		double[] x = new double[dimNum];
		for (int i = 0; i < size; i++) {
			for (int j = 0; j < dimNum; j++)
				x[j] = data[i * dimNum + j];

			int label = types[i];
			for (int d = 0; d < dimNum; d++) {
				m_vars[label][d] += (x[d] - m_means[label][d]) * (x[d] - m_means[label][d]);
			}

			// accumulate the overall mean and second moment
			for (int d = 0; d < dimNum; d++) {
				overMeans[d] += x[d];
				m_minVars[d] += x[d] * x[d];
			}
		}

		// compute the overall variance (* 0.01) as the minimum variance
		for (int d = 0; d < dimNum; d++) {
			overMeans[d] /= size;
			m_minVars[d] = Math.max(MIN_VAR, 0.01 * (m_minVars[d] / size - overMeans[d] * overMeans[d]));
		}

		// initialize each Gaussian
		for (int i = 0; i < mixNum; i++) {
			if (weights[i] > 0) {
				for (int d = 0; d < dimNum; d++) {
					m_vars[i][d] = m_vars[i][d] / counts[i];
					// a minimum variance for each dimension is required
					if (m_vars[i][d] < m_minVars[d]) {
						m_vars[i][d] = m_minVars[d];
					}
				}
			}
		}

		System.out.println("================= initialization =================");
		for (int i = 0; i < mixNum; i++) {
			for (int k = 0; k < dimNum; k++) {
				System.out.println("[" + i + "]:");
				System.out.println("means : " + m_means[i][k]);
				System.out.println("var : " + m_vars[i][k]);
				System.out.println();
			}
		}
	}

	/**
	 * @param sample - a sampled data point
	 * @return the total probability density of the point under the mixture
	 */
	public double getProbability(double[] sample) {
		double p = 0;
		for (int i = 0; i < mixNum; i++) {
			p += weights[i] * getProbability(sample, i);
		}
		return p;
	}

	/**
	 * Gaussian model - PDF
	 * @param x - the sampled data point vector
	 * @param j - index of the Gaussian component
	 * @return the probability density of x under component j
	 */
	public double getProbability(double[] x, int j) {
		double p = 1;
		for (int d = 0; d < dimNum; d++) {
			p *= 1 / Math.sqrt(2 * Math.PI * m_vars[j][d]);
			p *= Math.exp(-0.5 * (x[d] - m_means[j][d]) * (x[d] - m_means[j][d]) / m_vars[j][d]);
		}
		return p;
	}

	public static void main(String[] args) {
		GMMProcessor filter = new GMMProcessor(1, 2);
		filter.process(samples);
	}
}
The DataNode structure class:

package com.gloomyfish.image.gmm;

public class DataNode {
	public int cindex; // cluster index
	public int index;
	public double[] value;

	public DataNode(double[] v) {
		this.value = v;
		cindex = -1;
		index = -1;
	}
}

Five: Results

Here the initial center means are chosen by random sampling. The result of the GMM algorithm depends strongly on initialization; a common approach is to compute the initial center points with k-means. You can try modifying the code to initialize the parameters from k-means; I chose random initialization mainly for convenience.
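A sketch of what that k-means-based initialization could look like for 1-D data (a simple Lloyd's algorithm; the class name, fixed iteration count, and random seeding are assumptions for illustration, not part of the original code):

```java
import java.util.Arrays;

public class KMeansInit {
    // Simple 1-D Lloyd's algorithm; returns k cluster centers
    static double[] kmeans(double[] data, int k, int iterations) {
        double[] centers = new double[k];
        for (int i = 0; i < k; i++) {
            centers[i] = data[(int) (Math.random() * data.length)];
        }
        int[] assign = new int[data.length];
        for (int it = 0; it < iterations; it++) {
            // assignment step: each point goes to its nearest center
            for (int n = 0; n < data.length; n++) {
                int best = 0;
                for (int i = 1; i < k; i++) {
                    if (Math.abs(data[n] - centers[i]) < Math.abs(data[n] - centers[best])) {
                        best = i;
                    }
                }
                assign[n] = best;
            }
            // update step: each center moves to the mean of its points
            double[] sum = new double[k];
            int[] count = new int[k];
            for (int n = 0; n < data.length; n++) {
                sum[assign[n]] += data[n];
                count[assign[n]]++;
            }
            for (int i = 0; i < k; i++) {
                if (count[i] > 0) centers[i] = sum[i] / count[i];
            }
        }
        return centers;
    }

    public static void main(String[] args) {
        double[] samples = {10, 9, 4, 23, 13, 16, 5, 90, 100, 80, 55, 67, 8, 93, 47, 86, 3};
        System.out.println(Arrays.toString(kmeans(samples, 2, 20)));
    }
}
```

The returned centers could replace the random `m_means` values in `initParameters`.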


No hype, just practical content.

Please continue to follow this blog.





