【DeepLearning】Exercise:PCA and Whitening

Exercise:PCA and Whitening

Exercise link: Exercise:PCA and Whitening

pca_gen.m

%%================================================================
%% Step 0a: Load data
%  Here we provide the code to load natural image data into x.
%  x will be a 144 * 10000 matrix, where the kth column x(:, k) corresponds to
%  the raw image data from the kth 12x12 image patch sampled.
%  You do not need to change the code below.

x = sampleIMAGESRAW();
figure('name','Raw images');
randsel = randi(size(x,2),200,1); % A random selection of samples for visualization
display_network(x(:,randsel));

%%================================================================
%% Step 0b: Zero-mean the data (by row)
%  You can make use of the mean and repmat/bsxfun functions.

% -------------------- YOUR CODE HERE --------------------
x = x - repmat(mean(x,1), size(x,1), 1); % subtract the mean of each patch (column)
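% An equivalent alternative (optional sketch) using bsxfun, which avoids
% materialising the repmat copy:
% x = bsxfun(@minus, x, mean(x,1));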

%%================================================================
%% Step 1a: Implement PCA to obtain xRot
%  Implement PCA to obtain xRot, the matrix in which the data is expressed
%  with respect to the eigenbasis of sigma, which is the matrix U.

% -------------------- YOUR CODE HERE --------------------
%xRot = zeros(size(x)); % You need to compute this
sigma = x * x' ./ size(x,2);   % covariance matrix of the zero-mean data
[u,s,v] = svd(sigma);          % columns of u are the eigenvectors of sigma
xRot = u' * x;                 % rotate the data into the eigenbasis
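% Optional sanity check (a sketch): u returned by svd should be
% orthonormal, so u'*u should be (numerically) the identity.
% assert(norm(u'*u - eye(size(u,1)), 'fro') < 1e-8);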

%%================================================================
%% Step 1b: Check your implementation of PCA
%  The covariance matrix for the data expressed with respect to the basis U
%  should be a diagonal matrix with non-zero entries only along the main
%  diagonal. We will verify this here.
%  Write code to compute the covariance matrix, covar.
%  When visualised as an image, you should see a straight line across the
%  diagonal (non-zero entries) against a blue background (zero entries).

% -------------------- YOUR CODE HERE --------------------
%covar = zeros(size(x, 1)); % You need to compute this
covar = xRot * xRot' ./ size(x,2);
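% Optional numeric check (a sketch): after the rotation the off-diagonal
% entries should be (numerically) zero.
% offDiag = covar - diag(diag(covar));
% disp(max(abs(offDiag(:))));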

% Visualise the covariance matrix. You should see a line across the
% diagonal against a blue background.
figure('name','Visualisation of covariance matrix');
imagesc(covar);

%%================================================================
%% Step 2: Find k, the number of components to retain
%  Write code to determine k, the number of components to retain in order
%  to retain at least 99% of the variance.

% -------------------- YOUR CODE HERE --------------------
%k = 0; % Set k accordingly
eigenvalue = diag(covar);
total = sum(eigenvalue);
tmpSum = 0;
for k=1:size(x,1)
    tmpSum = tmpSum+eigenvalue(k);
    if (tmpSum / total >= 0.99)   % retain at least 99% of the variance
        break;
    end
end
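% A vectorized alternative (optional sketch) that yields the same k:
% k = find(cumsum(eigenvalue) / total >= 0.99, 1);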
%%================================================================
%% Step 3: Implement PCA with dimension reduction
%  Now that you have found k, you can reduce the dimension of the data by
%  discarding the remaining dimensions. In this way, you can represent the
%  data in k dimensions instead of the original 144, which will save you
%  computational time when running learning algorithms on the reduced
%  representation.
%
%  Following the dimension reduction, invert the PCA transformation to produce
%  the matrix xHat, the dimension-reduced data with respect to the original basis.
%  Visualise the data and compare it to the raw data. You will observe that
%  there is little loss due to throwing away the principal components that
%  correspond to dimensions with low variation.

% -------------------- YOUR CODE HERE --------------------
%xHat = zeros(size(x));  % You need to compute this
xRot(k+1:size(x,1), :) = 0;   % zero out the components beyond the top k
xHat = u * xRot;              % map the reduced data back to the original basis
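% Equivalent one-liner (optional sketch) that avoids mutating xRot:
% xHat = u(:,1:k) * (u(:,1:k)' * x);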

% Visualise the data, and compare it to the raw data
% You should observe that the raw and processed data are of comparable quality.
% For comparison, you may wish to generate a PCA reduced image which
% retains only 90% of the variance.

figure('name',['PCA processed images ',sprintf('(%d / %d dimensions)', k, size(x, 1))]);
display_network(xHat(:,randsel));
figure('name','Raw images');
display_network(x(:,randsel));

%%================================================================
%% Step 4a: Implement PCA with whitening and regularisation
%  Implement PCA with whitening and regularisation to produce the matrix
%  xPCAWhite. 

%epsilon = 0;
epsilon = 0.1;
%xPCAWhite = zeros(size(x));

% -------------------- YOUR CODE HERE --------------------
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x; % rescale each rotated component by 1/sqrt(lambda_i + epsilon)
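% Equivalent bsxfun form (optional sketch) that avoids building the
% diagonal scaling matrix explicitly:
% xPCAWhite = bsxfun(@rdivide, u' * x, sqrt(diag(s) + epsilon));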

%%================================================================
%% Step 4b: Check your implementation of PCA whitening
%  Check your implementation of PCA whitening with and without regularisation.
%  PCA whitening without regularisation results in a covariance matrix
%  that is equal to the identity matrix. PCA whitening with regularisation
%  results in a covariance matrix with diagonal entries starting close to
%  1 and gradually becoming smaller. We will verify these properties here.
%  Write code to compute the covariance matrix, covar.
%
%  Without regularisation (set epsilon to 0 or close to 0),
%  when visualised as an image, you should see a red line across the
%  diagonal (entries equal to 1) against a blue background (zero entries).
%  With regularisation, you should see a red line that slowly turns
%  blue across the diagonal, corresponding to the diagonal entries
%  gradually falling below 1.

% -------------------- YOUR CODE HERE --------------------
covar = xPCAWhite * xPCAWhite' ./ size(x,2);
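% Optional numeric check (a sketch): with epsilon = 0 the diagonal of
% covar should be all ones; with regularisation each diagonal entry is
% lambda_i / (lambda_i + epsilon), i.e. slightly below 1.
% disp(max(abs(diag(covar) - 1)));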

% Visualise the covariance matrix. You should see a red line across the
% diagonal against a blue background.
figure('name','Visualisation of covariance matrix');
imagesc(covar);

%%================================================================
%% Step 5: Implement ZCA whitening
%  Now implement ZCA whitening to produce the matrix xZCAWhite.
%  Visualise the data and compare it to the raw data. You should observe
%  that whitening results in, among other things, enhanced edges.

% -------------------- YOUR CODE HERE --------------------
%xZCAWhite = zeros(size(x));
xZCAWhite = u * xPCAWhite;   % rotate the PCA-whitened data back into the original basis
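% Equivalent full-matrix form (optional sketch): the ZCA whitening matrix
% is W = u * diag(1./sqrt(diag(s)+epsilon)) * u', so one could also write:
% xZCAWhite = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;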

% Visualise the data, and compare it to the raw data.
% You should observe that the whitened images have enhanced edges.
figure('name','ZCA whitened images');
display_network(xZCAWhite(:,randsel));
figure('name','Raw images');
display_network(x(:,randsel));
Posted: 2024-10-09 11:20:03
