Exercise: PCA in 2D

Step 0: Load data

The starter code loads 45 two-dimensional data points into x. Plot them using the scatter function to see the raw point cloud.

Step 1: Implement PCA

In this step, you will implement PCA to obtain xRot, the matrix in which the data is "rotated" into the basis made up of the principal components.

Step 1a: Finding the PCA basis

Find u1 and u2, the two principal directions, and draw two lines in your figure to show the resulting basis on top of the given data points.
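For reference, in the tutorial's notation the basis comes from the covariance matrix of the data,

\Sigma = \frac{1}{m} \sum_{i=1}^{m} x^{(i)} \big(x^{(i)}\big)^{T},

and u_1, u_2 are the columns of the eigenvector matrix U of \Sigma (computed with svd(sigma) in the code below).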

Step 1b: Check xRot

Compute xRot, and use the scatter function to check that it looks as expected: the rotated point cloud should be aligned with the coordinate axes.
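Concretely, the rotated data is just the data expressed in the new basis,

x_{\mathrm{rot}} = U^{T} x,

which corresponds to xRot = u' * x in the code below.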

Step 2: Dimension reduce and replot

In this step, set k, the number of components to retain, to 1. Then compute xHat, the data reconstructed from only the first principal component, and replot the result.
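In the tutorial's notation, keeping only the first component and projecting back onto the original axes gives

\tilde{x} = u_{1}^{T} x, \qquad \hat{x} = u_{1} \tilde{x},

which the code below implements as xHat = u * ([u(:,1), zeros(n,1)]' * x).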

Step 3: PCA Whitening
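For reference, PCA whitening divides each rotated coordinate by the square root of the corresponding eigenvalue \lambda_{i}, with a small regularizer \epsilon (1e-5 in the code below), so that every coordinate has roughly unit variance:

x_{\mathrm{PCAwhite},i} = \frac{x_{\mathrm{rot},i}}{\sqrt{\lambda_{i} + \epsilon}}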

Step 4: ZCA Whitening
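ZCA whitening rotates the PCA-whitened data back into the original coordinate system:

x_{\mathrm{ZCAwhite}} = U \, x_{\mathrm{PCAwhite}}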

Code

close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load the data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point.
%  You do not need to change the code below.

x = load('pcaData.txt', '-ascii'); % load the data
figure(1);
scatter(x(1, :), x(2, :)); % plot the raw data points as circles
title('Raw data');

%%================================================================
%% Step 1a: Implement PCA to obtain U
%  Implement PCA to obtain the rotation matrix U, which is the eigenbasis
%  of sigma.

% -------------------- YOUR CODE HERE --------------------
u = zeros(size(x, 1)); % You need to compute this
[n, m] = size(x);
% x = x - repmat(mean(x, 2), 1, m);  % optional preprocessing: subtract each dimension's mean so both dimensions are zero-mean
sigma = (1.0 / m) * x * x';  % covariance matrix
[u, s, v] = svd(sigma);      % columns of u are the principal directions, diag(s) the eigenvalues

% --------------------------------------------------------
hold on
plot([0 u(1,1)], [0 u(2,1)]); % draw the first principal direction u1
plot([0 u(1,2)], [0 u(2,2)]); % draw the second principal direction u2
scatter(x(1, :), x(2, :));
hold off
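% Optional sanity check (not part of the starter code): the columns of u
% should form an orthonormal basis, so u'*u should be close to the 2x2
% identity matrix.
disp(u' * u);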

%%================================================================
%% Step 1b: Compute xRot, the projection onto the eigenbasis
%  Now, compute xRot by projecting the data onto the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE --------------------
xRot = zeros(size(x)); % You need to compute this
xRot = u' * x;

% -------------------------------------------------------- 

% Visualise xRot. The rotated data should be decorrelated, i.e. the point
% cloud should be aligned with the coordinate axes.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');
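% Optional check (not part of the starter code): the covariance of xRot
% should be (nearly) diagonal, since rotating into the eigenbasis
% decorrelates the two coordinates.
disp((1.0 / m) * (xRot * xRot'));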

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1.
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting xRot back onto the original axes
%  to see the effect of dimension reduction.

% -------------------- YOUR CODE HERE --------------------
k = 1; % Use k = 1 and project the data onto the first basis vector
xHat = zeros(size(x)); % You need to compute this
xHat = u * ([u(:,1), zeros(n, 1)]' * x); % dimension reduction: keep only the first component
% the points now fall along the direction of the first eigenvector rather
% than being spread over the original coordinate axes

% --------------------------------------------------------
figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');
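% Optional check (not part of the starter code): the average squared
% reconstruction error from keeping only k = 1 component should equal the
% variance along the discarded direction, i.e. s(2,2).
disp(norm(x - xHat, 'fro')^2 / m);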

%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5;
% -------------------- YOUR CODE HERE --------------------
xPCAWhite = zeros(size(x)); % You need to compute this
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;  % divide each rotated coordinate by the square root of its eigenvalue so every coordinate has comparable variance
% --------------------------------------------------------
figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');
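% Optional check (not part of the starter code): after PCA whitening the
% covariance should be close to the identity matrix (up to the epsilon
% regularization).
disp((1.0 / m) * (xPCAWhite * xPCAWhite'));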

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE --------------------
xZCAWhite = zeros(size(x)); % You need to compute this
xZCAWhite = u * diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x; % PCA-whiten, then rotate back to the original basis

% --------------------------------------------------------
figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');
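% Optional comparison (not part of the starter code): among all whitening
% transforms, ZCA whitening keeps the data closest to the original input,
% so overlaying the two point clouds shows they share the same orientation.
figure(6);
hold on
scatter(x(1, :), x(2, :));
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('Raw data vs. xZCAWhite');
hold off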

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)