PCA Dimensionality Reduction Lecture Slides (PCA降维算法讲义, uploaded 2019-03-01)

Intrinsic latent dimensions
In this dataset there are only 3 degrees of freedom of variability,
corresponding to vertical and horizontal translations and rotation.

[Figure: copies of a digit image under random displacements and rotations]

Each image undergoes a random displacement and rotation within
some larger image field.
The resulting images have 100 × 100 = 10,000 pixels.
Generative View

Each data example is generated by first selecting a point from a
distribution in the latent space, then generating a point from the
conditional distribution in the input space.

Simplest latent variable models: assume Gaussian
distributions for both latent and observed variables.

This leads to probabilistic formulations of Principal
Component Analysis and Factor Analysis.

We will first look at standard PCA, and then consider its probabilistic
formulation.

Advantages of the probabilistic formulation: use of EM for parameter
estimation, mixtures of PCAs, Bayesian PCA.
Principal Component Analysis

Used for data compression, visualization, feature extraction, and dimensionality
reduction.

The goal is to find the M principal components
underlying D-dimensional data:
- select the top M eigenvectors \(u_1, \dots, u_M\) of S (the data
  covariance matrix);
- project each input vector x into this subspace.

The full projection into M dimensions takes the form
\[ z_n = (u_1^T x_n, \dots, u_M^T x_n). \]

Two views/derivations:
- maximize the variance of the projections (the scatter of the green
  points in the figure);
- minimize the reconstruction error (the red-green distance
  per data point).

[Figure: 2-D data points (red) projected onto a 1-D subspace (green)]
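The projection steps above can be sketched in NumPy. This is a minimal illustration (not from the slides); the data, shapes, and variable names are made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # hypothetical data: N=200 samples, D=5
M = 2                                  # number of principal components to keep

# Data covariance matrix S = (1/N) sum_n (x_n - xbar)(x_n - xbar)^T
mean = X.mean(axis=0)
S = (X - mean).T @ (X - mean) / len(X)

# eigh returns eigenvalues in ascending order; reverse to get the top M
eigvals, eigvecs = np.linalg.eigh(S)
U = eigvecs[:, ::-1][:, :M]            # columns are u_1, ..., u_M

# Project each (centered) input vector into the M-dimensional subspace
Z = (X - mean) @ U                     # shape (N, M)
```

In practice the eigendecomposition is often replaced by an SVD of the centered data matrix, which is numerically more stable for large D.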
Maximum variance formulation

Consider a dataset \(\{x_1, \dots, x_N\}\), \(x_n \in \mathbb{R}^D\). Our goal is to project the data onto a
space having dimensionality M < D.

Consider the projection onto an M = 1 dimensional space.
Define the direction of this space using a
D-dimensional unit vector \(u_1\), so that \(u_1^T u_1 = 1\).

Objective: maximize the variance of the projected
data with respect to \(u_1\):

\[ \frac{1}{N} \sum_{n=1}^{N} \{ u_1^T x_n - u_1^T \bar{x} \}^2 = u_1^T S u_1, \]

where the sample mean and data covariance are given by

\[ \bar{x} = \frac{1}{N} \sum_{n=1}^{N} x_n, \qquad
   S = \frac{1}{N} \sum_{n=1}^{N} (x_n - \bar{x})(x_n - \bar{x})^T. \]
Maximum variance formulation

Maximize the variance of the projected data:

\[ \frac{1}{N} \sum_{n=1}^{N} \{ u_1^T x_n - u_1^T \bar{x} \}^2 = u_1^T S u_1. \]

Must constrain \(u_1^T u_1 = 1\). Using a Lagrange
multiplier \(\lambda_1\), maximize

\[ u_1^T S u_1 + \lambda_1 (1 - u_1^T u_1). \]

Setting the derivative with respect to \(u_1\) to zero gives

\[ S u_1 = \lambda_1 u_1. \]

Hence \(u_1\) must be an eigenvector of S.

The maximum variance of the projected data is given by

\[ u_1^T S u_1 = \lambda_1. \]

The optimal \(u_1\) is the principal component (the eigenvector with the maximal eigenvalue).
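This result can be checked numerically. In the sketch below (a made-up example; the data and variable names are assumptions, not from the slides), the variance of the 1-D projection onto the leading eigenvector equals the leading eigenvalue, and no other unit direction does better:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical correlated data so the eigenvalues differ
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))
mean = X.mean(axis=0)
S = (X - mean).T @ (X - mean) / len(X)   # covariance with the 1/N convention

eigvals, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
u1, lam1 = eigvecs[:, -1], eigvals[-1]   # leading eigenvector / eigenvalue

# Variance of the 1-D projection u_1^T x equals u_1^T S u_1 = lambda_1
proj_var = ((X - mean) @ u1).var()
```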
Minimum error formulation

Introduce a complete orthonormal set of D-dimensional basis vectors
\(\{u_i\}\), \(i = 1, \dots, D\), with \(u_i^T u_j = \delta_{ij}\).

Without loss of generality, we can write

\[ x_n = \sum_{i=1}^{D} \alpha_{ni} u_i, \qquad \alpha_{ni} = x_n^T u_i. \]

This is a rotation of the coordinate system to a
new system defined by \(\{u_i\}\).

Our goal is to represent the data points by a projection onto an M-dimensional
subspace (plus some distortion).

Represent the M-dimensional linear subspace by the first M of the basis vectors:

\[ \tilde{x}_n = \sum_{i=1}^{M} z_{ni} u_i + \sum_{i=M+1}^{D} b_i u_i. \]
Minimum error formulation

Represent the M-dimensional linear subspace by the first M of the basis vectors:

\[ \tilde{x}_n = \sum_{i=1}^{M} z_{ni} u_i + \sum_{i=M+1}^{D} b_i u_i, \]

where the \(z_{ni}\) depend on the particular data point and the \(b_i\)
are constants.

Objective: minimize the distortion with respect to \(u_i\), \(z_{ni}\),
and \(b_i\):

\[ J = \frac{1}{N} \sum_{n=1}^{N} \| x_n - \tilde{x}_n \|^2. \]

Minimizing with respect to \(z_{nj}\) and \(b_j\) gives

\[ z_{nj} = x_n^T u_j, \qquad b_j = \bar{x}^T u_j. \]

Hence, the objective reduces to

\[ J = \frac{1}{N} \sum_{n=1}^{N} \sum_{i=M+1}^{D} (x_n^T u_i - \bar{x}^T u_i)^2 = \sum_{i=M+1}^{D} u_i^T S u_i. \]
Minimum error formulation

Minimize the distortion with respect to \(u_i\): a constrained minimization
problem (subject to \(u_i^T u_i = 1\)):

\[ J = \sum_{i=M+1}^{D} u_i^T S u_i. \]

The general solution is obtained by choosing the \(u_i\) to
be eigenvectors of the covariance matrix:

\[ S u_i = \lambda_i u_i. \]

The distortion is then given by

\[ J = \sum_{i=M+1}^{D} \lambda_i. \]

The objective is minimized when the remaining D − M components are the
eigenvectors of S with the lowest eigenvalues → the same result as the
maximum variance formulation.
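The minimum-error result can also be verified numerically: the mean squared reconstruction error of a rank-M projection equals the sum of the discarded eigenvalues. A minimal sketch (made-up data; names are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 6)) @ rng.normal(size=(6, 6))  # hypothetical data
mean = X.mean(axis=0)
S = (X - mean).T @ (X - mean) / len(X)

eigvals, eigvecs = np.linalg.eigh(S)   # ascending eigenvalues
M = 2
U = eigvecs[:, ::-1][:, :M]            # keep the top-M eigenvectors

# Reconstruction x~_n = xbar + U U^T (x_n - xbar), i.e. z_nj = x_n^T u_j
# for the kept components and b_j = xbar^T u_j for the discarded ones
X_rec = mean + (X - mean) @ U @ U.T

# Mean squared distortion J = (1/N) sum_n ||x_n - x~_n||^2
J = ((X - X_rec) ** 2).sum(axis=1).mean()
```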
We will later see a generalization: deep autoencoders.