% LMS algorithm demo (MATLAB)
% Set parameters: n is the number of samples per period, u is the step size
clear, clc;
n = 16;
u = 0.1;
% Set the number of iterations
k = 1000;
% Random noise: a 1-by-k matrix drawn from the normal distribution
rk = randn(1, k)/2;
pha = mean(rk);    % average element of the noise (close to 0)
% Set the starting weight values
wk(1, :) = [0 0];
% Use the LMS algorithm to compute the optimal weights
for i = 1:k
    xk(i, :) = [sin(2*pi*i/n) sin(2*pi*(i-1)/n)] + rk(i);   % input signal
    yk(i) = xk(i, :) * wk(i, :)';                           % output signal
    dk(i) = 2*cos(2*pi*i/n);                                % desired signal
    err(i) = dk(i) - yk(i);                                 % error
    wk(i+1, :) = wk(i, :) + 2*u*err(i)*xk(i, :);            % weight update
end
% Evaluate the performance surface
% (the meshgrid range is garbled in the source; this range covers the
% weight trajectory)
[x, y] = meshgrid(-10:0.1:10, -10:0.1:10);
z = (0.5 + pha)*(x.^2 + y.^2) + x.*y*cos(2*pi/n) + 2*y*sin(2*pi/n) + 2;
% Compute the theoretical optimal weights x1, y1 (the minimum of z)
x1 = 2*cos(2*pi/n)*sin(2*pi/n) / ((1 + 2*pha)^2 - cos(2*pi/n)^2);
y1 = -2*(1 + 2*pha)*sin(2*pi/n) / ((1 + 2*pha)^2 - cos(2*pi/n)^2);
% Draw the contour lines of the performance surface
figure, contour(x, y, z, [0.78 1.9 6.3 13.6 37]);
% Trajectory of the weights during the iteration
hold on; plot(wk(:, 1), wk(:, 2), 'r');
% Mark the theoretical optimum
hold on; plot(x1, y1, '*');
% Plot the error against the iteration number
figure, plot(err);
This is someone else's program, and it gives good results; working through it helps in understanding both the principle and the code.
References:
1. http://zhidao.baidu.com/question/53628331
2. http://www.cnblogs.com/LeftNotEasy/archive/2010/12/05/mathmatic_in_machine_learning_1_regression_and_gradient_descent.html