A Brief Talk on Compressed Sensing (24): Reconstruction Algorithm Subspace Pursuit (SP)


Main content:

    1. Algorithm flow for SP
    2. The MATLAB implementation of SP
    3. Experiment and results for a one-dimensional signal
    4. Experiment and results on the relationship between the number of measurements M and the probability of successful reconstruction
    5. Performance comparison of SP and CoSaMP
First, the algorithm flow of SP

Compressive Sampling Matching Pursuit (CoSaMP) and Subspace Pursuit (SP) are almost identical, so their algorithm flows are essentially the same.

The main difference between SP and CoSaMP is that "in each iteration, in the SP algorithm, only K new candidates are added, while the CoSaMP algorithm adds 2K vectors"; that is, SP selects K atoms per iteration while CoSaMP selects 2K. The benefit is that "this makes the SP algorithm computationally more efficient".
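To make the difference concrete, here is a minimal sketch of the identification step only, using the variable names of the implementation below (A is the sensing matrix, res the current residual, K the sparsity level); the toy data on the first line is an assumption added just so the snippet runs on its own:

A = randn(64,256); res = randn(64,1); K = 10; % toy data for illustration only
product = A' * res;                      % inner product of each column with the residual
[~,pos] = sort(abs(product),'descend');  % rank columns by correlation magnitude
Js_SP     = pos(1:K);                    % SP: add K new candidates per iteration
Js_CoSaMP = pos(1:2*K);                  % CoSaMP: add 2K new candidates per iteration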

Algorithm flow for SP:

The initialization of this algorithm is similar to the first iteration of CoSaMP. Note that step (1) selects K atoms: "K indices corresponding to the largest magnitude entries", whereas CoSaMP selects the 2K largest at this point; the rest of the flow is the same. Step (5) adds a condition for stopping the iteration: when the residual has grown after an iteration, stop iterating.
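A minimal sketch of that stopping rule follows (the MATLAB implementation below uses a simpler absolute threshold, norm(res) < 1e-6, instead); res_old and res_new are hypothetical names for the residuals before and after the current iteration:

% Stopping rule from the paper: halt once the residual stops shrinking.
% res_old and res_new are placeholder names, not variables from the code below.
if norm(res_new) > norm(res_old)
    break; % the residual grew, so keep the previous support estimate and stop
end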

The concrete algorithm steps are consistent with those in "A Brief Talk on Compressed Sensing (23): Reconstruction Algorithm Compressive Sampling Matching Pursuit (CoSaMP)"; one only needs to change the 2K in step (2) to K.

"Greedy class algorithm, although the complexity of low speed, but its reconstruction accuracy is not as good as the BP algorithm, in order to find a better compromise of complexity and precision, SP algorithm came into being", "SP algorithm and Cosamp algorithm as its basic idea is to borrow the idea of backtracking, To re-estimate the trustworthiness of all candidates during each iteration, the SP algorithm has similar properties and advantages and disadvantages to the cosamp algorithm.

Second, the MATLAB implementation of SP (CS_SP.m)
function [ theta ] = CS_SP( y, A, K )
%CS_SP  Subspace Pursuit
%   Detailed explanation goes here
%   y = Phi * x
%   x = Psi * theta
%   y = Phi * Psi * theta
%   Let A = Phi*Psi; then y = A*theta
%   K is the sparsity level
%   y and A are known; solve for theta
%   Reference: Dai W, Milenkovic O. Subspace pursuit for compressive
%   sensing signal reconstruction[J]. IEEE Transactions on Information
%   Theory, 2009, 55(5): 2230-2249.
    [m,n] = size(y);
    if m < n
        y = y'; % y should be a column vector
    end
    [M,N] = size(A); % the sensing matrix A is an M*N matrix
    theta = zeros(N,1); % stores the recovered theta (column vector)
    pos_num = []; % stores the indices of the columns of A selected during iteration
    res = y; % initialize the residual to y
    for kk = 1:K % iterate at most K times
        %(1) Identification
        product = A' * res; % inner products of each column of A with the residual
        [val,pos] = sort(abs(product),'descend');
        Js = pos(1:K); % select the K columns with the largest inner products
        %(2) Support Merger
        Is = union(pos_num,Js); % union of pos_num and Js
        %(3) Estimation
        % At must have more rows than columns; this is the basis of least
        % squares (columns linearly independent)
        if length(Is) <= M
            At = A(:,Is); % form the matrix At from these columns of A
        else % more columns than rows: the columns must be linearly
             % dependent and At'*At will not be invertible
            break; % exit the for loop
        end
        % y = At*theta; solve for the least-squares solution of theta
        theta_ls = (At'*At)^(-1)*At'*y; % least-squares solution
        %(4) Pruning
        [val,pos] = sort(abs(theta_ls),'descend');
        %(5) Sample Update
        pos_num = Is(pos(1:K));
        theta_ls = theta_ls(pos(1:K));
        % At(:,pos(1:K))*theta_ls is the orthogonal projection of y onto
        % the column space of At(:,pos(1:K))
        res = y - At(:,pos(1:K))*theta_ls; % update the residual
        if norm(res) < 1e-6 % repeat the steps until r = 0
            break; % exit the for loop
        end
    end
    theta(pos_num) = theta_ls; % the recovered theta
end
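One implementation note (my addition, not part of the original code): the explicit normal-equations form (At'*At)^(-1)*At'*y works when At has full column rank, but MATLAB's backslash operator returns the same least-squares solution more stably and efficiently:

theta_ls = At \ y; % equivalent least-squares solution (via QR), numerically more robust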
Third, experiment and results for a one-dimensional signal
% Test of the compressed sensing reconstruction algorithm
clear all; close all; clc;
M = 64;  % number of measurements
N = 256; % length of the signal x
K = 10;  % sparsity level of the signal x
Index_K = randperm(N);
x = zeros(N,1);
x(Index_K(1:K)) = 5*randn(K,1); % x is K-sparse with random support
Psi = eye(N); % x is sparse in the canonical basis, so the sparsifying matrix is the identity: x = Psi*theta
Phi = randn(M,N); % the measurement matrix is a Gaussian matrix
A = Phi * Psi; % sensing matrix
y = Phi * x; % obtain the measurement vector y
%% Recover the reconstructed signal x
tic
theta = CS_SP(y,A,K);
x_r = Psi * theta; % x = Psi * theta
toc
%% Plot
figure;
plot(x_r,'k.-'); % plot the recovered signal
hold on;
plot(x,'r');     % plot the original signal x
hold off;
legend('Recovery','Original');
fprintf('\nRecovery residual: ');
norm(x_r-x) % recovery residual
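Since x and Phi are drawn at random, each run yields a different recovery residual. If you want repeatable runs, one option (not in the original script) is to fix the random seed before generating the data:

rng(0); % fix the seed so randperm and randn produce identical draws on every run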

Fourth, experiment and results on the relationship between the number of measurements M and the probability of successful reconstruction
clear all; close all; clc;
%% Parameter configuration and initialization
CNT = 1000; % number of repetitions for each (K,M,N) triple
N = 256; % length of the signal x
Psi = eye(N); % x is sparse in the canonical basis: x = Psi*theta
K_set = [4,12,20,28,36]; % set of sparsity levels for the signal x
Percentage = zeros(length(K_set),N); % stores the probability of successful recovery
%% Main loop: traverse each (K,M,N) triple
tic
for kk = 1:length(K_set)
    K = K_set(kk); % sparsity level for this pass
    M_set = 2*K:5:N; % no need to traverse every M; testing every 5th is enough
    PercentageK = zeros(1,length(M_set)); % success probabilities for different M at this K
    for mm = 1:length(M_set)
        M = M_set(mm); % number of measurements for this pass
        fprintf('K=%d, M=%d\n',K,M);
        P = 0;
        for cnt = 1:CNT % run CNT trials for each number of measurements
            Index_K = randperm(N);
            x = zeros(N,1);
            x(Index_K(1:K)) = 5*randn(K,1); % x is K-sparse with random support
            Phi = randn(M,N)/sqrt(M); % the measurement matrix is a Gaussian matrix
            A = Phi * Psi; % sensing matrix
            y = Phi * x; % obtain the measurement vector y
            theta = CS_SP(y,A,K); % recover the reconstructed theta
            x_r = Psi * theta; % x = Psi * theta
            if norm(x_r-x) < 1e-6 % a residual below 1e-6 counts as a successful recovery
                P = P + 1;
            end
        end
        PercentageK(mm) = P/CNT*100; % compute the recovery probability
    end
    Percentage(kk,1:length(M_set)) = PercentageK;
end
toc
save SPMtoPercentage1000 % the run takes a while, so store all variables
%% Plot
S = ['-ks';'-ko';'-kd';'-kv';'-k*'];
figure;
for kk = 1:length(K_set)
    K = K_set(kk);
    M_set = 2*K:5:N;
    L_Mset = length(M_set);
    plot(M_set,Percentage(kk,1:L_Mset),S(kk,:)); % one curve per sparsity level
    hold on;
end
hold off;
xlim([0 256]);
legend('K=4','K=12','K=20','K=28','K=36');
xlabel('Number of measurements (M)');
ylabel('Percentage recovered');
title('Percentage of input signals recovered correctly (N=256) (Gaussian)');

Fifth, performance comparison of SP and CoSaMP

After running the script from "Fourth, experiment and results on the relationship between the number of measurements M and the probability of successful reconstruction" for both SP and CoSaMP, load the saved variables and plot both sets of curves on the same figure to see which algorithm performs better.

clear all; close all; clc;
load CoSaMPMtoPercentage1000; % variables saved by the CoSaMP experiment
PercentageCoSaMP = Percentage;
load SPMtoPercentage1000; % variables saved by the SP experiment
PercentageSP = Percentage;
S1 = ['-ks';'-ko';'-kd';'-kv';'-k*'];
S2 = ['-rs';'-ro';'-rd';'-rv';'-r*'];
figure;
for kk = 1:length(K_set)
    K = K_set(kk);
    M_set = 2*K:5:N;
    L_Mset = length(M_set);
    plot(M_set,PercentageCoSaMP(kk,1:L_Mset),S1(kk,:)); % CoSaMP curve (black)
    hold on;
    plot(M_set,PercentageSP(kk,1:L_Mset),S2(kk,:)); % SP curve (red)
end
hold off;
xlim([0 256]);
legend('CoSaK=4','SPK=4','CoSaK=12','SPK=12','CoSaK=20',...
    'SPK=20','CoSaK=28','SPK=28','CoSaK=36','SPK=36');
xlabel('Number of measurements (M)');
ylabel('Percentage recovered');
title('Percentage of input signals recovered correctly (N=256) (Gaussian)');

Conclusion: In general, SP performs better than CoSaMP, especially when M is small.

Sixth, reference articles

http://blog.csdn.net/jbb0523/article/details/45441459

