STOR892-9-11-2014 - STOR 892 Object Oriented Data Analysis

Participant Presentations
Please Sign Up:
• Name
• Email (Onyen is fine, or …)
• Are You Enrolled?
• Tentative Title (???? Is OK)
• When:
Next Week, Early, Oct., Nov., Late
PCA to find clusters
PCA of Mass Flux Data:
Statistical Smoothing
In 1 Dimension, 2 Major Settings:
• Density Estimation: “Histograms”
• Nonparametric Regression: “Scatterplot Smoothing”
Kernel Density Estimation
Chondrite Data:
• Sum pieces to estimate density
• Suggests 3 modes (rock sources)
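The “sum pieces to estimate density” idea can be sketched directly: center one Gaussian “piece” at each data point, scale each to integrate to 1/n, and sum. A minimal sketch (the data values and bandwidth below are made up for illustration, not the chondrite measurements):

```python
import numpy as np

def kde(x_grid, data, h):
    """Gaussian kernel density estimate: average of kernel 'pieces',
    one centered at each data point."""
    z = (x_grid[:, None] - data) / h          # one column per data point
    pieces = np.exp(-0.5 * z**2) / (np.sqrt(2 * np.pi) * h)
    return pieces.mean(axis=1)                # sum of pieces, each weight 1/n

# made-up sample, loosely in the chondrite silica-percentage range
data = np.array([20.0, 21.5, 22.0, 26.5, 27.0, 30.5, 31.0, 33.5])
grid = np.linspace(10.0, 40.0, 601)
fhat = kde(grid, data, h=1.0)

# a density estimate should be nonnegative and integrate to about 1
area = np.sum(fhat) * (grid[1] - grid[0])
```

Varying `h` here is exactly the smoothing level that the later SiZer discussion is about.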
Scatterplot Smoothing
E.g. Bralower Fossil Data – some smooths
Statistical Smoothing
Fundamental Question
For both of
• Density Estimation: “Histograms”
• Regression: “Scatterplot Smoothing”
Which bumps are “really there”?
vs. “artifacts of sampling noise”?
SiZer Background
Fun Scale Space Views (Incomes Data)
Surface View
SiZer Background
SiZer analysis of British Incomes data:
• Oversmoothed: Only one mode
• Medium smoothed: Two modes, statistically significant
(Confirmed by Schmitz & Marron (1992))
• Undersmoothed: many noise wiggles, not significant
Again: all are correct, just at different scales
SiZer Background
Scale Space and Kernel Choice, i.e. Shape of Window
Compelling Answer: Gaussian
Only “Variation Diminishing” Kernel Shape,
i.e. # Modes decreases with bandwidth, h
Lindeberg (1994)
Chaudhuri & Marron (2000)
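The “variation diminishing” property can be seen numerically: with a Gaussian kernel, a clearly bimodal toy sample shows two modes at a small bandwidth and one at a large bandwidth. Counting modes on a grid is only a rough sketch of the scale-space idea, and the data below are invented:

```python
import numpy as np

def kde(x_grid, data, h):
    z = (x_grid[:, None] - data) / h
    return (np.exp(-0.5 * z**2) / (np.sqrt(2 * np.pi) * h)).mean(axis=1)

def count_modes(y):
    # strict interior local maxima of the gridded density
    return int(np.sum((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])))

data = np.array([-0.1, 0.0, 0.1, 4.9, 5.0, 5.1])   # two tight clusters
grid = np.linspace(-3.0, 8.0, 1101)

n_small = count_modes(kde(grid, data, h=0.3))  # fine scale: both clusters visible
n_large = count_modes(kde(grid, data, h=5.0))  # coarse scale: one broad bump
```

For the Gaussian kernel the mode count can never increase as `h` grows, which is the Lindeberg / Chaudhuri & Marron point above.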
SiZer Background
Recall Hidalgo Stamps Data
SiZer Overview
Would you like to try smoothing & SiZer?
• Marron Software Website as Before
• In “Smoothing” Directory:
– kdeSM.m
– nprSM.m
– sizerSM.m
Recall: “>> help sizerSM” for usage
PCA to find clusters
Return to PCA of Mass Flux Data:
SiZer analysis of Mass Flux, PC1:
• All 3 modes significant
• Also in curvature
• And in other components
Conclusion:
• Found 3 significant clusters!
• Correspond to 3 known “cloud types”
• Worth deeper investigation
Recall Yeast Cell Cycle Data
• “Gene Expression” – Micro-array data
• Data (after major preprocessing):
Expression “level” of:
• thousands of genes
(d ~ 1,000s)
• but only dozens of “cases” (n ~ 10s)
• Interesting statistical issue:
High Dimension Low Sample Size data
(HDLSS)
Yeast Cell Cycle Data, FDA View
Central question:
Which genes are “periodic” over 2 cell cycles?
Yeast Cell Cycle Data, FDA View
Periodic genes?
Naïve approach: Simple PCA
• No apparent (2 cycle) periodic structure?
• Eigenvalues suggest large amount of “variation”
• PCA finds “directions of maximal variation”
• Often, but not always, same as “interesting directions”
• Here need better approach to study periodicities
Yeast Cell Cycles, Freq. 2 Proj.
PCA on Freq. 2 Periodic Component of Data
Frequency 2 Analysis
• Project data onto 2-dim space of sin and cos (freq. 2)
• Useful view: scatterplot
• Angle (in polar coordinates) shows phase
• Approach from Zhao, Marron & Wells (2004)
• Colors: Spellman’s cell cycle phase classification
• Black was labeled “not periodic”
• Within class phases approx’ly same, but notable differences
• Now try to improve “phase classification”
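The frequency-2 projection amounts to least-squares regression of each gene’s time course on cos and sin at frequency 2 over the observation window, with phase read off in polar coordinates. A sketch on a synthetic “gene” (the 18 sampling times and the signal are invented, not the Spellman data):

```python
import numpy as np

T = 18                              # invented: 18 equally spaced time points
t = np.arange(T)
w = 2 * np.pi * 2 / T               # frequency 2 over the window (2 cycles)
D = np.column_stack([np.cos(w * t), np.sin(w * t)])   # 2-dim target space

# synthetic gene: exact frequency-2 oscillation with phase 1 radian
phase_true = 1.0
x = np.cos(phase_true) * np.cos(w * t) + np.sin(phase_true) * np.sin(w * t)

coef, *_ = np.linalg.lstsq(D, x, rcond=None)   # project onto span{cos, sin}
phase_hat = np.arctan2(coef[1], coef[0])       # angle in polar coordinates
amp_hat = np.hypot(coef[0], coef[1])           # distance from origin
```

In the scatterplot view above, `coef` is the plotted point for one gene; `phase_hat` is the angle used for the phase classification.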
Yeast Cell Cycle
Revisit “phase classification”, approach:
• Use outer 200 genes
(other numbers tried, less resolution)
• Study distribution of angles
• Use SiZer analysis
(finds significant bumps, etc., in histogram)
• Carefully redrew boundaries
• Check by studying k.d.e. angles
SiZer Study of Dist’n of Angles
Reclassification of Major Genes
Compare to Previous Classif’n
New Subpopulation View
Note: Subdensities Have Same Bandwidth & Proportional Areas (so Σ = 1)
Detailed Look at PCA
Now Study “Folklore” More Carefully
• Background
• History
• Underpinnings
(Mathematical & Computational)
Good Overall Reference:
Jolliffe (2002)
PCA: Rediscovery – Renaming
Statistics:
Principal Component Analysis (PCA)
Social Sciences:
Factor Analysis (PCA is a subset)
Probability / Electrical Eng:
Karhunen–Loève expansion
Applied Mathematics:
Proper Orthogonal Decomposition (POD)
Geo-Sciences:
Empirical Orthogonal Functions (EOF)
An Interesting Historical Note
The 1st (?) application of PCA to Functional Data Analysis:
Rao (1958)
1st Paper with “Curves as Data Objects” viewpoint
Detailed Look at PCA
Three Important (& Interesting) Viewpoints:
1. Mathematics
2. Numerics
3. Statistics
Goal: Study Interrelationships
1st: Review Linear Alg. and Multivar. Prob.
Review of Linear Algebra
Vector Space:
• set of “vectors”, $x$
• and “scalars” (coefficients), $a$
• “closed” under “linear combination”
($\sum_i a_i x_i$ in space)
• e.g. $\mathbb{R}^d = \left\{ x = (x_1, \ldots, x_d)^t : x_1, \ldots, x_d \in \mathbb{R} \right\}$,
“d dim Euclid’n space”
Review of Linear Algebra (Cont.)
Subspace:
• Subset that is Again a Vector Space
• i.e. Closed under Linear Combination
• e.g. Lines through the Origin
• e.g. Planes through the Origin
• Note: Planes not Through the Origin are not Subspaces
(Do not Contain $0 \cdot x = 0$)
• e.g. Subspace “Generated By” a Set of Vectors
(all Linear Combos of them = Containing Hyperplane through Origin)
Review of Linear Algebra (Cont.)
Basis of Subspace: Set of Vectors that:
• Span, i.e. Everything is a Lin. Com. of them
• are Linearly Indep’t, i.e. Lin. Com. is Unique
• e.g. the “Unit Vector Basis” of $\mathbb{R}^d$:
$\begin{pmatrix}1\\0\\\vdots\\0\end{pmatrix}, \begin{pmatrix}0\\1\\\vdots\\0\end{pmatrix}, \ldots, \begin{pmatrix}0\\0\\\vdots\\1\end{pmatrix}$
• Since
$\begin{pmatrix}x_1\\x_2\\\vdots\\x_d\end{pmatrix} = x_1\begin{pmatrix}1\\0\\\vdots\\0\end{pmatrix} + x_2\begin{pmatrix}0\\1\\\vdots\\0\end{pmatrix} + \cdots + x_d\begin{pmatrix}0\\0\\\vdots\\1\end{pmatrix}$
Review of Linear Algebra (Cont.)
Basis Matrix, of subspace of $\mathbb{R}^d$:
Given a basis $v_1, \ldots, v_n$, create matrix of columns:
$B = (v_1 \cdots v_n) = \begin{pmatrix} v_{11} & \cdots & v_{1n} \\ \vdots & & \vdots \\ v_{d1} & \cdots & v_{dn} \end{pmatrix}_{d \times n}$
Review of Linear Algebra (Cont.)
Then linear combo is a matrix multiplicat’n:
$\sum_{i=1}^n a_i v_i = Ba$, where $a = (a_1, \ldots, a_n)^t$
Note: Right Multiplication Gives:
Linear Combination of Column Vectors
Check sizes: $(d \times 1) = (d \times n)(n \times 1)$
Review of Linear Algebra (Cont.)
Aside on Matrix Multiplication: (linear transformat’n)
For matrices
$A = \begin{pmatrix} a_{1,1} & \cdots & a_{1,m} \\ \vdots & & \vdots \\ a_{k,1} & \cdots & a_{k,m} \end{pmatrix}$, $B = \begin{pmatrix} b_{1,1} & \cdots & b_{1,n} \\ \vdots & & \vdots \\ b_{m,1} & \cdots & b_{m,n} \end{pmatrix}$
Define the Matrix Product
$AB = \begin{pmatrix} \sum_{i=1}^m a_{1,i} b_{i,1} & \cdots & \sum_{i=1}^m a_{1,i} b_{i,n} \\ \vdots & & \vdots \\ \sum_{i=1}^m a_{k,i} b_{i,1} & \cdots & \sum_{i=1}^m a_{k,i} b_{i,n} \end{pmatrix}$
(Inner Products of Rows of $A$ With Columns of $B$)
(Composition of Linear Transformations)
Often Useful to Check Sizes: $(k \times n) = (k \times m)(m \times n)$
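The entrywise definition (row of $A$ dotted with column of $B$) can be coded directly and checked against NumPy’s built-in product; the sizes k, m, n are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
k, m, n = 2, 3, 4
A = rng.standard_normal((k, m))
B = rng.standard_normal((m, n))

# AB[j, l] = sum_i A[j, i] * B[i, l]: inner product of row j with column l
AB = np.empty((k, n))
for j in range(k):
    for l in range(n):
        AB[j, l] = sum(A[j, i] * B[i, l] for i in range(m))
# size check: (k x n) = (k x m)(m x n)
```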
Review of Linear Algebra (Cont.)
Matrix Trace:
• For a Square Matrix $A = \begin{pmatrix} a_{1,1} & \cdots & a_{1,m} \\ \vdots & & \vdots \\ a_{m,1} & \cdots & a_{m,m} \end{pmatrix}$
• Define $\mathrm{tr}(A) = \sum_{i=1}^m a_{i,i}$
• Trace Commutes with Matrix Multiplication:
$\mathrm{tr}(AB) = \mathrm{tr}(BA)$
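A quick numerical check that the trace commutes with matrix multiplication; note it holds even for non-square factors, as long as both products are square (random matrices, arbitrary sizes):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))   # AB is 3 x 3, BA is 5 x 5
B = rng.standard_normal((5, 3))

tr_AB = np.trace(A @ B)
tr_BA = np.trace(B @ A)
# both equal sum over i, j of A[i, j] * B[j, i]
```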
Review of Linear Algebra (Cont.)
Dimension of Subspace (a Notion of “Size”):
• Number of Elements in a Basis (Unique)
• $\dim(\mathbb{R}^d) = d$ (Use Basis Above)
• e.g. dim of a line is 1
• e.g. dim of a plane is 2
• Dimension is “Degrees of Freedom”
(in Statistical Uses, e.g. ANOVA)
Review of Linear Algebra (Cont.)
Norm of a Vector:
• in $\mathbb{R}^d$: $\|x\| = \left(\sum_{j=1}^d x_j^2\right)^{1/2} = (x^t x)^{1/2}$
• Idea: length of the vector
• Note: strange properties for high $d$,
e.g. “length of diagonal of unit cube” $= \sqrt{d}$
• Length Normalized Vector: $x / \|x\|$
(has Length 1, thus on Surf. of Unit Sphere & is a Direction Vector)
• Define Distance as: $d(x, y) = \|x - y\| = \left((x - y)^t (x - y)\right)^{1/2}$
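The norm formulas and the “diagonal of the unit cube” curiosity, checked in NumPy (the example vectors are arbitrary):

```python
import numpy as np

x = np.array([3.0, 4.0])
norm_x = np.sqrt(np.sum(x**2))     # (sum of x_j^2)^{1/2}
norm_via_t = np.sqrt(x @ x)        # (x^t x)^{1/2}, the same value

unit = x / norm_x                  # length-normalized direction vector

d = 100
diag = np.linalg.norm(np.ones(d))  # diagonal of the unit cube in R^d: sqrt(d)
```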
Review of Linear Algebra (Cont.)
Inner (Dot, Scalar) Product:
• for Vectors $x$ and $y$: $\langle x, y \rangle = \sum_{j=1}^d x_j y_j = x^t y$
• Related to Norm, via $\|x\| = \langle x, x \rangle^{1/2}$
• measures “angle between $x$ and $y$” as:
$\mathrm{angle}(x, y) = \cos^{-1}\left(\frac{\langle x, y \rangle}{\|x\| \, \|y\|}\right) = \cos^{-1}\left(\frac{x^t y}{\sqrt{x^t x}\,\sqrt{y^t y}}\right)$
• key to Orthogonality, i.e. Perpendicul’ty:
$x \perp y$ if and only if $\langle x, y \rangle = 0$
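The angle formula and the orthogonality criterion in code, with vectors chosen so the answers are known in advance:

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

inner = x @ y                                  # <x, y> = x^t y
angle = np.arccos(inner / (np.linalg.norm(x) * np.linalg.norm(y)))
# angle between (1, 0) and (1, 1) is pi / 4

perp = np.array([0.0, 1.0])
# x and perp are orthogonal: their inner product is 0
```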
Review of Linear Algebra (Cont.)
Orthonormal Basis $v_1, \ldots, v_n$:
• All Orthogonal to each other,
i.e. $\langle v_i, v_{i'} \rangle = 0$, for $i \neq i'$
• All have Length 1,
i.e. $\langle v_i, v_i \rangle = 1$, for $i = 1, \ldots, n$
Review of Linear Algebra (Cont.)
Orthonormal Basis $v_1, \ldots, v_n$ (cont.):
• Spectral Representation: $x = \sum_{i=1}^n a_i v_i$, where $a_i = \langle x, v_i \rangle$
(Coefficient is Inner Product, Cool Notation)
Check: $\langle x, v_i \rangle = \left\langle \sum_{i'=1}^n a_{i'} v_{i'}, v_i \right\rangle = \sum_{i'=1}^n a_{i'} \langle v_{i'}, v_i \rangle = a_i$
• Matrix Notation: $x = Ba$, where $a^t = x^t B$, i.e. $a = B^t x$,
for the Basis Matrix $B = (v_1 \cdots v_n)$
• $a$ is called transform of $x$ (e.g. Fourier or Wavelet)
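For an orthonormal basis the transform is just $a = B^t x$, and $Ba$ reconstructs $x$. A check with a random orthonormal basis of $\mathbb{R}^4$ (obtained via a QR decomposition, an arbitrary way to manufacture one):

```python
import numpy as np

rng = np.random.default_rng(2)
# random orthonormal basis matrix of R^4 from a QR decomposition
Bmat, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)
a = Bmat.T @ x        # a_i = <x, v_i>: the transform of x
x_recon = Bmat @ a    # spectral representation sum_i a_i v_i
```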
Review of Linear Algebra (Cont.)
Parseval identity, for $x$ in subsp. gen’d by o.n. basis $v_1, \ldots, v_n$:
$\|x\|^2 = \sum_{i=1}^n \langle x, v_i \rangle^2 = \sum_{i=1}^n a_i^2 = \|a\|^2$
• Pythagorean theorem
• “Decomposition of Energy”
• ANOVA - sums of squares
• Transform, $a$, has same length as $x$,
i.e. “rotation in $\mathbb{R}^d$”
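Parseval says the transform has the same length (“energy”) as the vector; continuing in the same style (random orthonormal basis via QR):

```python
import numpy as np

rng = np.random.default_rng(3)
Bmat, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # orthonormal basis of R^5

x = rng.standard_normal(5)
a = Bmat.T @ x             # transform of x

energy_x = np.sum(x**2)    # ||x||^2
energy_a = np.sum(a**2)    # sum_i <x, v_i>^2 = ||a||^2
```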
Review of Linear Algebra (Cont.)
Projection of a Vector $x$ onto a Subspace $V$:
• Idea: Member of $V$ that is Closest to $x$
(i.e. “Best Approx’n”)
• Find $P_V x \in V$ that Solves: $\min_{v \in V} \|x - v\|$
(“Least Squares”)
• For Inner Product (Hilbert) Space:
$P_V x$ Exists and is Unique
• General Solution in $\mathbb{R}^d$: for Basis Matrix $B_V$,
$P_V x = B_V (B_V^t B_V)^{-1} B_V^t x$
• So Proj’n Operator is Matrix Mult’n:
$P_V = B_V (B_V^t B_V)^{-1} B_V^t$
(thus projection is another linear operation)
(note same operation underlies least squares)
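The general formula $P_V = B_V (B_V^t B_V)^{-1} B_V^t$, checked on a 2-dim subspace of $\mathbb{R}^3$; the basis is arbitrary and deliberately not orthonormal, to exercise the $(B_V^t B_V)^{-1}$ factor:

```python
import numpy as np

BV = np.array([[1.0, 1.0],
               [0.0, 1.0],
               [2.0, 0.0]])                  # non-orthonormal basis of a plane
PV = BV @ np.linalg.inv(BV.T @ BV) @ BV.T    # projection matrix

x = np.array([1.0, 2.0, 3.0])
px = PV @ x                                  # best approximation of x in V

# projection is idempotent, and the least-squares residual
# is orthogonal to every basis vector of V
resid = x - px
```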
Review of Linear Algebra (Cont.)
Projection using Orthonormal Basis $v_1, \ldots, v_n$:
• Basis Matrix is Orthonormal:
$B_V^t B_V = \begin{pmatrix} v_1^t \\ \vdots \\ v_n^t \end{pmatrix} (v_1 \cdots v_n) = \begin{pmatrix} \langle v_1, v_1 \rangle & \cdots & \langle v_1, v_n \rangle \\ \vdots & & \vdots \\ \langle v_n, v_1 \rangle & \cdots & \langle v_n, v_n \rangle \end{pmatrix} = \begin{pmatrix} 1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 1 \end{pmatrix} = I_{n \times n}$
• So $P_V x = B_V (B_V^t x)$ = Recon(Coeffs of $x$ “in $V$ Dir’n”)
(Recall Right Mult’n)
Review of Linear Algebra (Cont.)
Projection using Orthonormal Basis (cont.):
• For Orthogonal Complement, $V^\perp$:
$x = P_V x + P_{V^\perp} x$ and $\|x\|^2 = \|P_V x\|^2 + \|P_{V^\perp} x\|^2$
• Parseval Inequality:
$\|P_V x\|^2 = \sum_{i=1}^n \langle x, v_i \rangle^2 = \sum_{i=1}^n a_i^2 = \|a\|^2 \leq \|x\|^2$
Review of Linear Algebra (Cont.)
(Real) Unitary Matrices: $U_{d \times d}$ with $U^t U = I$
• Orthonormal Basis Matrix
(So All of Above Applies)
• Note Transform’n is: Distance Preserving
$d(Ux, Uy) = \|U(x - y)\| = \left(\sum_{i=1}^d (x_i - y_i)^2\right)^{1/2} = \|x - y\| = d(x, y)$
• Lin. Trans. (Mult. by $U$) is ~ Rotation
• But also Includes “Mirror Images”
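Distance preservation under a unitary (real orthogonal) matrix; `U` here is an arbitrary rotation manufactured with a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(4)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # satisfies U^t U = I

x = rng.standard_normal(3)
y = rng.standard_normal(3)

d_before = np.linalg.norm(x - y)
d_after = np.linalg.norm(U @ x - U @ y)  # same distance after the rotation
```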
Review of Linear Algebra (Cont.)
Singular Value Decomposition (SVD):
For a Matrix $X_{d \times n}$:
Find a Diagonal Matrix $S_{d \times n}$,
with Entries $s_1, \ldots, s_{\min(d,n)}$ called Singular Values
And Unitary (Rotation) Matrices $U_{d \times d}$, $V_{n \times n}$
(recall $U^t U = V^t V = I$)
So That $X = U S V^t$
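The decomposition $X = USV^t$ computed and reassembled with NumPy; `np.linalg.svd` returns the full $U$ and $V^t$ plus the vector of singular values, which must be embedded in a $d \times n$ diagonal $S$ to match the slide’s shapes (random matrix, arbitrary sizes):

```python
import numpy as np

rng = np.random.default_rng(5)
d, n = 5, 3
X = rng.standard_normal((d, n))

U, s, Vt = np.linalg.svd(X)     # U: d x d, s: min(d, n) values, Vt: n x n
S = np.zeros((d, n))
S[:n, :n] = np.diag(s)          # embed singular values in a d x n diagonal S

X_recon = U @ S @ Vt
```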
Review of Linear Algebra (Cont.)
Intuition behind Singular Value Decomposition:
• For $X$ a “linear transf’n” (via matrix multi’n):
$Xv = U S V^t v = U (S (V^t v))$
• First rotate
• Second rescale coordinate axes (by $s_i$)
• Third rotate again
• i.e. have diagonalized the transformation
Review of Linear Algebra (Cont.)
SVD Compact Representation:
Useful Labeling: $s_1 \geq \cdots \geq s_{\min(n,d)}$
Singular Values in Decreasing Order
Note: singular values = 0 can be omitted
(Since do “0-Stretching”)
Let $r$ = # of positive singular values
Then: $X = U_{d \times r} S_{r \times r} V_{n \times r}^t$
Where $U_{d \times r}, S_{r \times r}, V_{n \times r}$ are truncations of $U, S, V$
Review of Linear Algebra (Cont.)
SVD Full Representation:
$X_{d \times n} = U_{d \times d} \, S_{d \times n} \, V^t_{n \times n}$
(Graphics Display Assumes $d \geq n$; $U$ is a Full Rank Basis Matrix; $S$ has All 0s in Bottom Rows)
SVD Reduced Representation:
$X_{d \times n} = U_{d \times n} \, S_{n \times n} \, V^t_{n \times n}$
(The Columns of $U$ Multiplying the 0 Rows of $S$ Get 0ed Out; Also, Some Diagonal Entries of $S$ May be 0)
SVD Compact Representation:
$X_{d \times n} = U_{d \times r} \, S_{r \times r} \, V^t_{r \times n}$
(Zero Singular Values, with the Corresponding Columns and Rows, Get 0ed Out)
Review of Linear Algebra (Cont.)
Eigenvalue Decomposition:
For a (Symmetric) Square Matrix $X_{d \times d}$:
Find a Diagonal Matrix $D = \begin{pmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_d \end{pmatrix}$,
Entries Called Eigenvalues
Convenient Ordering: $\lambda_1 \geq \cdots \geq \lambda_d$
And an Orthonormal Matrix $B_{d \times d}$
(i.e. $B B^t = B^t B = I_{d \times d}$)
So that: $X B = B D$, i.e. $X = B D B^t$
Review of Linear Algebra (Cont.)
Eigenvalue Decomposition (cont.):
• Relation to Singular Value Decomposition
(looks similar?):
• Eigenvalue Decomposition “Looks Harder”
• Since Needs $U = V$
• Price is Eigenvalue Decomp’n is Generally Complex
(uses $i = \sqrt{-1}$)
• Except for $X$ Square and Symmetric
• Then Eigenvalue Decomp. is Real Valued
• Thus is the Sing’r Value Decomp. with: $U = V = B$