# LMS Algorithm Adaptive Filter


Directory

1. Introduction to Adaptive Filters
2. Adaptive Filtering Noise Cancellation Principle
3. LMS Algorithm Principle
4. MATLAB Implementation
    4.1. lmsfilter()
    4.2. lmsmain()
5. Analysis of Results

1. Introduction to Adaptive Filters

Adaptive filtering uses the filter parameters obtained at the previous moment to automatically adjust the parameters at the current moment, adapting to unknown or time-varying statistical characteristics of the signal and noise and thereby approaching optimal filtering. An adaptive filter is essentially a Wiener filter that can adjust its own transfer characteristics toward the optimum. It requires no prior knowledge of the input signal and has a small computational cost, which makes it especially suitable for real-time processing. By contrast, the Wiener filter has fixed parameters and suits stationary random signals, while the Kalman filter has time-varying parameters and suits nonstationary random signals; both, however, achieve optimal filtering only when the statistical characteristics of the signal and noise are known in advance. Such prior knowledge is often unavailable in practice, and in that situation adaptive filtering can still deliver excellent performance, which gives it good application value.

Commonly used adaptive filtering techniques include the least mean squares (LMS) adaptive filter, the recursive least squares (RLS) filter, the lattice filter, and the infinite impulse response (IIR) filter. Applications of these techniques include adaptive noise cancellation, adaptive line enhancement, and notch filtering.

2. Adaptive Filtering Noise Cancellation Principle

Figure 1 shows the basic principle of using adaptive noise cancellation to extract a signal from a noisy background. The primary input receives the signal s from the signal source, but it is corrupted by the noise v0 from the noise source. The reference input receives a noise signal v1 that is independent of the useful signal s but correlated with v0. The primary input thus contains the additive noise to be cancelled, and the reference input supplies a noise correlated with the v0 component of the primary input. Exploiting the correlation between the two noises and the independence of signal and noise, the adaptive filtering algorithm processes the reference input v1 so that the filter output approximates the interference component in the primary input as closely as possible; subtracting that output from the primary input yields the error signal. In the optimal case, the filter output approximating v0 is equivalent to the system output e approximating s, so the signal-to-noise ratio at the canceller output is greatly improved. However, if the reference channel picks up not only the noise v1 but also a signal component, the filter output will contain that signal component and the cancellation performance degrades. To obtain good noise cancellation, the reference input should therefore be picked up at a point in the noisy environment where as little of the signal as possible can be detected.
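The structure described above can be simulated end to end. This is a minimal NumPy sketch (not the article's MATLAB code): the primary input is d = s + v0, the reference v1 is white noise, and v0 is a hypothetical FIR-filtered copy of v1, so the two noises are correlated while s is independent of both.

```python
import numpy as np

# Hypothetical illustration of the noise-cancellation setup:
# primary input d = s + v0; reference v1 correlated with v0 but not with s.
rng = np.random.default_rng(0)
n = 20000
s = np.sin(2 * np.pi * 0.01 * np.arange(n))                # useful signal
v1 = rng.standard_normal(n)                                # reference noise
v0 = np.convolve(v1, [1.0, -0.5, 0.25], mode="full")[:n]   # correlated primary noise
d = s + v0                                                 # primary (noisy) input

# LMS noise canceller: filter v1 so its output y approximates v0,
# then the error e = d - y approximates s.
M, mu = 8, 0.01
w = np.zeros(M)
e = np.zeros(n)
for k in range(M, n):
    x = v1[k - M + 1:k + 1][::-1]   # M most recent reference samples, newest first
    y = w @ x                       # filter output (estimate of v0)
    e[k] = d[k] - y                 # error output (estimate of s)
    w += 2 * mu * e[k] * x          # LMS weight update

# After convergence, the error output is much closer to s than the primary was.
tail = slice(n // 2, n)
mse_before = np.mean((d[tail] - s[tail]) ** 2)
mse_after = np.mean((e[tail] - s[tail]) ** 2)
```

Because s is uncorrelated with the reference, the filter can only cancel the v0 component, so the error output converges toward s; the residual mean-square error drops well below the input noise power.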

3. LMS Algorithm Principle

The least mean squares (LMS) algorithm adjusts the filter weights by stochastic gradient descent on the instantaneous squared error. At iteration k, with input tap vector x(k) and desired response d(k), the filter output is y(k) = w(k-1)ᵀ x(k) and the error is e(k) = d(k) - y(k); the weights are then updated as w(k) = w(k-1) + 2μ e(k) x(k). For the mean weight vector to converge, the step size μ must be greater than 0 and less than the reciprocal of the largest eigenvalue of the autocorrelation matrix of the input.
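The update equations can be checked on a system-identification task. This is a hypothetical NumPy port of the article's MATLAB `lmsfilter` (same interface: per-iteration weight columns, error sequence, and output computed with the final weights); with a noise-free desired signal, the weights should converge to the coefficients of the unknown FIR system.

```python
import numpy as np

def lms_filter(xn, dn, M, mu):
    """Return (yn, W, en); W holds one weight column per iteration."""
    itr = len(xn)
    en = np.zeros(itr)
    W = np.zeros((M, itr))
    for k in range(M - 1, itr):                 # 0-based analogue of MATLAB k = M:itr
        x = xn[k - M + 1:k + 1][::-1]           # M most recent taps, newest first
        y = W[:, k - 1] @ x                     # filter output
        en[k] = dn[k] - y                       # a-priori error
        W[:, k] = W[:, k - 1] + 2 * mu * en[k] * x   # w(k) = w(k-1) + 2*mu*e(k)*x(k)
    yn = np.full(itr, np.inf)                   # undefined before the filter fills
    for k in range(M - 1, itr):
        x = xn[k - M + 1:k + 1][::-1]
        yn[k] = W[:, -1] @ x                    # output with the final weights
    return yn, W, en

rng = np.random.default_rng(1)
xn = rng.standard_normal(50000)
h = np.array([0.7, -0.3, 0.2])                  # hypothetical unknown system
dn = np.convolve(xn, h, mode="full")[:len(xn)]  # noise-free desired response
_, W, en = lms_filter(xn, dn, M=3, mu=0.005)
# W[:, -1] should be close to h after convergence
```

Since the desired signal here contains no noise, the error decays toward zero and the final weight column matches h almost exactly; with noisy data the weights would instead fluctuate around the Wiener solution with a variance proportional to μ.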

4. MATLAB Implementation

4.1. lmsfilter(xn, dn, M, mu)

```matlab
function [yn, W, en] = lmsfilter(xn, dn, M, mu)
% Input parameters:
%   xn  input signal sequence (column vector)
%   dn  desired response sequence (column vector)
%   M   filter order (scalar)
%   mu  convergence factor (step size) (scalar); must be greater than 0
%       and less than the reciprocal of the largest eigenvalue of the
%       autocorrelation matrix of xn
% Output parameters:
%   yn  actual output sequence (column vector)
%   W   filter weight matrix of size M x itr; each column holds the
%       weights of one iteration
%   en  error sequence (itr x 1 column vector)

itr = length(xn);
en = zeros(itr, 1);   % en(k): error between desired and actual output
                      % at the k-th iteration
W = zeros(M, itr);    % each column is the weight vector of one iteration,
                      % initialized to zero

% Iterative computation
for k = M:itr
    x = xn(k:-1:k-M+1);                  % M input taps of the filter
    y = W(:, k-1).' * x;                 % filter output
    en(k) = dn(k) - y;                   % error of the k-th iteration
    W(:, k) = W(:, k-1) + 2*mu*en(k)*x;  % weight update formula
end

% Output sequence of the filter at the optimal (final) weights;
% this part can be omitted if the return value yn is not needed
yn = inf * ones(size(xn));   % inf marks samples before the filter fills
for k = M:length(xn)
    x = xn(k:-1:k-M+1);
    yn(k) = W(:, end).' * x; % output using the last (best) weight estimate
end
```

4.2. lmsmain()

```matlab
clc; clear all; close all;

%% Generate the signal source
[x, fs, bits] = wavread('C:\Users\mahl\Desktop\modern signal processing\MAHLDSP\LMS\WAV\XP boot.wav');
s = x(:, 1);   % take one channel of the stereo file as the signal source
wavwrite(s, fs, bits, 'C:\Users\mahl\Desktop\modern signal processing\MAHLDSP\LMS\WAV\raw audio.wav');  % save the original audio
N = length(s);
t = (0:N-1);
figure(1);
subplot(4,1,1); plot(t, s); grid;
ylabel('amplitude'); xlabel('time'); title('raw audio signal');

%% Generate a noise signal with mean 0 and variance 0.1
v = sqrt(0.1) * randn(N, 1);

%% Noise generated by an AR model
ar = [1, 1/2];               % AR model coefficients
v_ar = filter(1, ar, v);
% subplot(4,1,2); plot(t, v_ar); grid;
% ylabel('amplitude'); xlabel('time'); title('AR model noise signal');

%% Noise generated by an MA model (correlated with the AR-model noise)
ma = [1, -0.8, 0.4, -0.2];   % MA model coefficients
v_ma = filter(ma, 1, v);
subplot(4,1,2); plot(t, v_ma); grid;
ylabel('amplitude'); xlabel('time'); title('correlated noise signal');

%% Generate the desired (primary) signal
dn = s + v_ar;
wavwrite(dn, fs, bits, 'C:\Users\mahl\Desktop\modern signal processing\MAHLDSP\LMS\WAV\noisy audio.wav');  % save the noisy audio
subplot(4,1,3); plot(t, dn); grid;
ylabel('amplitude'); xlabel('time'); title('audio signal with noise');

%% LMS filtering algorithm
M = 10;        % filter order
mu = 0.0005;   % step size of the filter
[ylms, w, elms] = lmsfilter(v_ma, dn, M, mu);

%% Plot the denoised speech signal
subplot(4,1,4); plot(t, elms); grid;
ylabel('amplitude'); xlabel('time'); title('audio signal after noise removal');
wavwrite(elms, fs, bits, 'C:\Users\mahl\Desktop\modern signal processing\MAHLDSP\LMS\WAV\denoised audio.wav');  % save the denoised audio

%% Residual noise
e = s - elms;
figure(2);
subplot(2,1,1); plot(t, e); grid;
ylabel('amplitude'); xlabel('time'); title('residual noise');

%% Compare a short segment of the three signals
subplot(2,1,2);
t = (50000:50500);
plot(t, elms(50000:50500,1), 'r', t, e(50000:50500,1), 'g', t, s(50000:50500,1), 'b');
axis([50000, 50500, -1, 1]);
ylabel('amplitude'); xlabel('time');
legend('denoised speech signal', 'residual noise', 'raw audio');
title('a short segment of the three signals');
```

5. Analysis of Results

As Figure 2 shows, although the real audio signal is mixed with strong noise, to the point that the noise drowns out the signal, the LMS adaptive filter recovers the real signal well, so the adaptive filter has good denoising performance. From Figures 3, 4, and 5, the convergence time decreases as the step size μ increases: when μ is small, convergence is slow and the audio remains mixed with noise, and as μ grows the audio becomes clearer. Once the step size exceeds a certain value, however, the audio becomes noisy again, and for still larger μ even the audio signal itself is filtered out. In short, if μ is too small the convergence is too slow, while if it is too large the adaptation becomes unstable and diverges. Choosing the best value of μ is therefore an important problem in the LMS algorithm.
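The step-size tradeoff can be demonstrated numerically. This is a hypothetical NumPy sketch (the article's experiments use MATLAB audio): the same LMS loop identifies a short FIR system once with a small μ, which converges, and once with a μ far beyond the stability bound, which diverges.

```python
import numpy as np

def lms_mse(xn, dn, M, mu):
    """Run LMS and return the mean-square error over the second half."""
    w = np.zeros(M)
    err = np.zeros(len(xn))
    for k in range(M - 1, len(xn)):
        x = xn[k - M + 1:k + 1][::-1]   # M most recent taps, newest first
        e = dn[k] - w @ x               # a-priori error
        err[k] = e
        w = w + 2 * mu * e * x          # LMS weight update
    return np.mean(err[len(xn) // 2:] ** 2)

rng = np.random.default_rng(2)
xn = rng.standard_normal(4000)
dn = np.convolve(xn, [0.5, 0.25], mode="full")[:len(xn)]  # hypothetical system

small = lms_mse(xn, dn, M=2, mu=0.01)  # within the bound: residual MSE is tiny
large = lms_mse(xn, dn, M=2, mu=5.0)   # far beyond the bound: error blows up
```

With unit-power white input and two taps, a step size of 5.0 violates the convergence condition on μ by a wide margin, so the error grows without bound, while μ = 0.01 drives the residual error essentially to zero.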
