UFLDL Tutorial (III): PCA and Whitening Exercise

Exercise: PCA and Whitening

Step 0: Data Preparation

The file downloaded from UFLDL contains the dataset IMAGES_RAW, a 512*512*10 matrix holding ten images of size 512*512.

(a) Data loading

Using the sampleIMAGESRAW function, extract numpatches image patches from IMAGES_RAW, each of size patchsize; the extracted patches are stored column-wise in the matrix patches, i.e. patches(:, i) holds all the pixel values of the i-th patch.
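In the pca_gen code at the end of this post, the call actually takes no arguments (the patch count and patch size are fixed inside the provided function), giving a 144 * 10000 matrix of 12x12 patches:

x = sampleIMAGESRAW();   % each column of x is one 12x12 patch, stored as 144 pixel values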

(b) Zero-meaning the data

The mean pixel value of each patch is subtracted from all the pixel values of that patch, so that each patch has zero mean.
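As a one-line sketch (using the same variable names as the pca_gen code below):

x = x - repmat(mean(x), size(x, 1), 1);   % subtract from each column (patch) its own mean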

The figure below shows a random selection of image patches.

Step 1: Perform PCA

This step is divided into two parts.

(1) PCA computation: here the data x is only rotated to obtain xRot; no principal components are discarded yet.

In particular:

① Compute the covariance matrix Sigma of the data x: Sigma = x * x' / m, where m is the number of samples.

② Perform the eigendecomposition of Sigma, using MATLAB's eig function to obtain the matrix U of Sigma's eigenvectors:

[U, S] = eig(Sigma);

U = [u1, …, ui, …, un]; each of its columns is an eigenvector of Sigma, and n is the feature dimension of the input data.

S = diag([λ1, …, λi, …, λn]) is a diagonal matrix whose diagonal elements are the eigenvalues of Sigma; ui and λi correspond to each other.

For the subsequent computations, the columns of U need to be reordered so that their corresponding eigenvalues are in descending order.

The reordered matrix is still written as U, and the corresponding diagonal eigenvalue matrix is still written as S, i.e.:

U = [u1, …, ui, …, un], S = diag([λ1, …, λi, …, λn]), with λ1 ≥ … ≥ λi ≥ … ≥ λn

③ Use the matrix U to rotate the data x to obtain xRot, i.e. xRot = U' * x.
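Putting ① through ③ together, a minimal MATLAB sketch matching the pca_gen code below (eig does not guarantee any particular eigenvalue order, hence the explicit sort):

m = size(x, 2);                                 % number of samples
sigma = x * x' / m;                             % covariance matrix Sigma
[U, S] = eig(sigma);                            % columns of U are eigenvectors of Sigma
[S_value, S_index] = sort(diag(S), 'descend');  % sort eigenvalues in descending order
U = U(:, S_index);                              % reorder the eigenvectors to match
S = diag(S_value);
xRot = U' * x;                                  % rotate the data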

(2) Compute the covariance matrix covar of the rotated data and visualize it, to check whether the rotation is correct.

PCA guarantees that the covariance matrix of the rotated data is diagonal. If covar is computed correctly, its image should show a blue background with a line along the diagonal.

The covariance matrix covar is displayed with MATLAB's imagesc function, which is very convenient here.

imagesc(covar) renders the matrix covar as an image, assigning different colors to the different values in the matrix.
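For example (the colorbar call is an optional addition, not part of the original script):

covar = xRot * xRot' / m;   % covariance of the rotated data
figure; imagesc(covar);     % render the matrix as a color-coded image
colorbar;                   % optional: show the value-to-color mapping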

The resulting covariance matrix image is shown below; you can see that the diagonal has a different color from the rest of the image.

Step 2: Find the number of principal components satisfying the condition

In this step, find the number of principal components, k, that satisfies the condition.

That is, find the smallest k such that (λ1 + … + λk) / (λ1 + … + λn) ≥ some target percentage, e.g. 99%.
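Because the eigenvalues are already sorted in descending order, k can also be found with a cumulative sum; this vectorized form is an equivalent alternative to the explicit loop used in the pca_gen code below:

lambda = diag(S);                                   % sorted eigenvalues
k = find(cumsum(lambda) / sum(lambda) >= 0.99, 1);  % smallest k retaining 99% of the variance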

Step 3: Reduce the dimension of the data using the number of principal components found

In Step 2 the number k was found; that is, keeping the top k principal components of the data meets the requirement.

In this step, the data x is reduced in dimension, keeping only k principal components, to obtain xTilde.

At the same time, to judge the quality of the dimension-reduced data, U(:, 1:k) can be used to transform the reduced data back to the original dimension, i.e. to obtain an approximate recovery of the original data.

The recovered images are then displayed (with display_network) and compared with the originals: the first picture below shows the data recovered after dimension reduction, and the picture below it shows the corresponding original data. It can be seen that the reduced data recovers something very close to the original.
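In code, the reduction and the approximate recovery are each a single matrix product (a sketch equivalent to the pca_gen code below, which pads xTilde with zeros and multiplies by the full U):

xTilde = U(:, 1:k)' * x;      % reduce: keep only the top k components
xHat = U(:, 1:k) * xTilde;    % recover: map back to the original basis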

Step 4: PCA whitening + regularization

This step is divided into two parts.

(1) Implement PCA with whitening and regularization

First, rotate the data (using the eigenvector matrix U).

Then scale the rotated data using the eigenvalues to achieve whitening.

At the same time, when scaling by the eigenvalues, each eigenvalue is offset by the parameter ε to achieve regularization: the i-th whitened component is xRot,i / sqrt(λi + ε), as shown in the sketch below.
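In MATLAB, with the sorted eigenvalues on the diagonal of S (matching the pca_gen code below):

epsilon = 0.000001;                                     % regularization parameter
xPCAWhite = diag(1 ./ sqrt(diag(S) + epsilon)) * xRot;  % divide each component by sqrt(lambda_i + epsilon)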

(2) Compute the covariance matrix of the whitened data and inspect it.

If the regularization term is included, the diagonal elements of the covariance matrix are less than 1.

If the regularization term is not included (i.e. rotation + whitening only), the diagonal elements of the covariance matrix equal 1 (in practice, ε is simply set to a very small number rather than exactly 0).
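The check itself mirrors Step 1b (as in the pca_gen code below):

covar = xPCAWhite * xPCAWhite' / m;   % covariance of the whitened data
figure; imagesc(covar);               % inspect the diagonal entries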

The first image below shows the covariance matrix of the whitened data with regularization added; the second image shows the result without regularization.

Step 5: ZCA whitening

ZCA whitening performs one more rotation on top of PCA whitening, namely:

xZCAWhite = U * xPCAWhite

The first image below shows the result of ZCA whitening, and the second shows the corresponding original image.

As you can see, the ZCA-whitened result brings out the edges of the original image.
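Expanded, the whole ZCA transform is a single symmetric matrix applied to the zero-mean data; this combined form is a standard identity, written out here for clarity rather than taken from the original code:

xZCAWhite = U * diag(1 ./ sqrt(diag(S) + epsilon)) * U' * x;   % equals U * xPCAWhite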

Below is the pca_gen code for this exercise:

clc; clear; close all;

%%================================================================
%% Step 0a: Load data
%  Here we provide the code to load natural image data into x.
%  x will be a 144 * 10000 matrix, where the kth column x(:, k) corresponds
%  to the raw image data from the kth 12x12 image patch sampled.
%  You do not need to change the code below.

x = sampleIMAGESRAW();                 % read patches from IMAGES_RAW
figure('name', 'Raw images');          % open a figure titled "Raw images"
randsel = randi(size(x, 2), 200, 1);   % a random selection of samples for visualization
display_network(x(:, randsel));        % display the randomly selected patches

%% Step 0b: Zero-mean the data
%  You may find the mean and repmat/bsxfun functions useful.
% -------------------- YOUR CODE HERE --------------------
x = x - repmat(mean(x), size(x, 1), 1);  % subtract from each column (patch) its own mean

%%================================================================
%% Step 1a: Implement PCA to obtain xRot
%  Implement PCA to obtain xRot, the matrix in which the data is expressed
%  with respect to the eigenbasis of sigma, which is the matrix U.
% -------------------- YOUR CODE HERE --------------------
xRot = zeros(size(x));                 % you need to compute this
m = size(x, 2);                        % number of input samples
sigma = x * x' / m;                    % covariance matrix of the input data
[U, S] = eig(sigma);                   % eigendecomposition of the covariance matrix
[S_value, S_index] = sort(diag(S), 'descend');  % sort the eigenvalues in descending order
U = U(:, S_index);                     % reorder the eigenvectors accordingly
S = diag(S_value);
xRot = U' * x;                         % rotate the data

%% Step 1b: Check your implementation of PCA
%  The covariance matrix for the data expressed with respect to the basis U
%  should be a diagonal matrix with non-zero entries only along the main
%  diagonal. We will verify this here.
%  Write code to compute the covariance matrix, covar.
%  When visualised as an image, you should see a straight line across the
%  diagonal (non-zero entries) against a blue background (zero entries).
% -------------------- YOUR CODE HERE --------------------
covar = zeros(size(x, 1));             % you need to compute this
covar = xRot * xRot' / m;              % covariance matrix of the rotated data

%  Visualise the covariance matrix. You should see a line across the
%  diagonal against a blue background.
figure('name', 'Visualisation of covariance matrix');
imagesc(covar);

%%================================================================
%% Step 2: Find k, the number of components to retain
%  Write code to determine k, the number of components to retain in order
%  to retain at least 99% of the variance.
% -------------------- YOUR CODE HERE --------------------
k = 0;                                 % set k accordingly
s_diag = diag(S);
s_sum = sum(s_diag);
for k = 1:size(x, 1)
    sk_sum = sum(s_diag(1:k));
    if sk_sum / s_sum >= 0.99
        break;
    end
end

%%================================================================
%% Step 3: Implement PCA with dimension reduction
%  Now that you have found k, you can reduce the dimension of the data by
%  discarding the remaining dimensions. In this way, you can represent the
%  data in k dimensions instead of the original 144, which will save you
%  computational time when running learning algorithms on the reduced
%  representation.
%  Following the dimension reduction, invert the PCA transformation to
%  produce the matrix xHat, the dimension-reduced data with respect to the
%  original basis.
%  Visualise the data and compare it to the raw data. You will observe that
%  there is little loss due to throwing away the principal components that
%  correspond to dimensions with low variation.
% -------------------- YOUR CODE HERE --------------------
xTilde = U(:, 1:k)' * x;               % reduce the data to k dimensions
xHat = zeros(size(x));                 % you need to compute this
xHat = U * [xTilde; zeros(size(x, 1) - k, size(x, 2))];  % recover the data from xTilde

%  Visualise the data, and compare it to the raw data. You should observe
%  that the raw and processed data are of comparable quality. For
%  comparison, you may wish to generate a PCA reduced image which retains
%  only 90% of the variance.
figure('name', ['PCA processed images ', sprintf('(%d / %d dimensions)', k, size(x, 1))]);
display_network(xHat(:, randsel));
figure('name', 'Raw images');
display_network(x(:, randsel));

%%================================================================
%% Step 4a: Implement PCA with whitening and regularisation
%  Implement PCA with whitening and regularisation to produce the matrix
%  xPCAWhite.
epsilon = 0.000001;
% -------------------- YOUR CODE HERE --------------------
xPCAWhite = zeros(size(x));
xPCAWhite = diag(1 ./ sqrt(s_diag + epsilon)) * xRot;  % scale each rotated component by 1/sqrt(lambda_i + epsilon)

%% Step 4b: Check your implementation of PCA whitening
%  Check your implementation of PCA whitening with and without
%  regularisation. PCA whitening without regularisation results in a
%  covariance matrix that is equal to the identity matrix. PCA whitening
%  with regularisation results in a covariance matrix with diagonal entries
%  starting close to 1 and gradually becoming smaller. We will verify these
%  properties here.
%  Write code to compute the covariance matrix, covar.
%  Without regularisation (set epsilon to 0 or close to 0), when visualised
%  as an image, you should see a red line across the diagonal (one entries)
%  against a blue background (zero entries). With regularisation, you should
%  see a red line that slowly turns blue across the diagonal, corresponding
%  to the one entries slowly becoming smaller.
% -------------------- YOUR CODE HERE --------------------
covar = xPCAWhite * xPCAWhite' / m;    % covariance matrix of the whitened data

%  Visualise the covariance matrix. You should see a red line across the
%  diagonal against a blue background.
figure('name', 'Visualisation of covariance matrix');
imagesc(covar);

%%================================================================
%% Step 5: Implement ZCA whitening
%  Now implement ZCA whitening to produce the matrix xZCAWhite.
%  Visualise the data and compare it to the raw data. You should observe
%  that whitening results in, among other things, enhanced edges.
xZCAWhite = zeros(size(x));
% -------------------- YOUR CODE HERE --------------------
xZCAWhite = U * xPCAWhite;             % ZCA whitening: one more rotation on top of PCA whitening

%  Visualise the data, and compare it to the raw data. You should observe
%  that the whitened images have enhanced edges.
figure('name', 'ZCA whitened images');
display_network(xZCAWhite(:, randsel));
figure('name', 'Raw images');
display_network(x(:, randsel));


