Linear Algebra: Final Project
Instructor: Prof. Carrson C. Fung
Final project

In this project, you are asked to:

- Program the different parts of the "eigenface" facial identification algorithm
- Show your results quantitatively by plotting the so-called miss detection rate vs. signal-to-noise ratio (SNR)
- Answer some additional questions
Eigenface – face detection

[Figure: training-set faces and testing-set faces corrupted by noise at SNR = 5 dB, 0 dB, -5 dB, and -10 dB]
Facial Identification (and Recognition) using Eigenface

- Uses the idea of low-rank approximation
- Pros
  - Easy to implement
- Cons
  - Complete retraining is necessary when new faces become available
  - Not robust against occlusion, shadows, …
  - Use of global information (the correlation matrix of all the data) fails to capture local information in the images
  - In terms of identification, the detector is not optimal
Eigenface Training Steps

- Treat each new (face) image as a point in a Hilbert space H
  - An N×N pixel image → an N²-dimensional point in H
  - M training images, usually M < N²
- It is usually the dimension of the data, not the number of data points, that causes problems in such applications
  - Curse of dimensionality
  - "Big data" is a misnomer here
- Training phase (a MATLAB sketch follows below)
  - Calculate the average of the face data and subtract it from each training image
    - Let f_1, f_2, …, f_M ∈ ℝ^(N²) be the training images, and let γ ∈ ℝ^(N²) denote their mean
    - g_i = f_i − γ, ∀ i = 1, …, M
  - Calculate the corresponding correlation matrix
    - C = G Gᵀ = U_C Λ_C^(1/2) Λ_C^(T/2) U_Cᵀ ∈ ℝ^(N²×N²), where G = [g_1 g_2 … g_M] ∈ ℝ^(N²×M)
    - U_C, Λ_C ∈ ℝ^(N²×N²)
    - Note that C will usually be rank deficient since N² >> M
    - U_C(:, 1:M) is a basis for the g_i, for all i
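The training phase above might look as follows in MATLAB. This is only a minimal sketch: the folder name, file pattern, and variable names (imgFiles, F, G, gamma) are assumptions and not part of the provided project code.

```matlab
% Minimal sketch of the training phase (folder/file names are assumptions).
imgFiles = dir(fullfile('train', '*.pgm'));   % assumed training-image folder
M = numel(imgFiles);

firstImg = double(imread(fullfile('train', imgFiles(1).name)));
N = size(firstImg, 1);                        % images assumed to be N-by-N

F = zeros(N*N, M);                            % each column is one vectorized face f_i
for i = 1:M
    img = double(imread(fullfile('train', imgFiles(i).name)));
    F(:, i) = img(:);
end

gamma = mean(F, 2);                           % mean face, gamma in R^(N^2)
G = F - gamma;                                % g_i = f_i - gamma (implicit expansion)
% C = G*G' would be N^2-by-N^2 and rank deficient (rank <= M), so it is never
% formed explicitly; see the efficient computation below.
```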
Data representation

[Figure: illustration of data representation]
Eigenface Training Steps (cont'd)

- Only M eigenvectors are needed to represent all of the training data
  - Calculating the N²×N² matrix U_C is computationally inefficient and not necessary
  - Consider instead R = GᵀG ∈ ℝ^(M×M), which is symmetric
    - R = U_R Λ_R^(1/2) Λ_R^(T/2) U_Rᵀ ∈ ℝ^(M×M), with U_R, Λ_R ∈ ℝ^(M×M)
    - In matrix-vector form: R u_R,i = GᵀG u_R,i = λ_R,i u_R,i
    - Notice that G R u_R,i = G GᵀG u_R,i = C G u_R,i = λ_R,i G u_R,i ⇔ C v_i = λ_R,i v_i, for i = 1, 2, …, M
    - So v_i = G u_R,i, and V = [v_1 v_2 … v_M] ∈ ℝ^(N²×M) contains the first M eigenvectors of C
    - Calculating the v_i only requires computing the M vectors u_R,i and multiplying them by G (see the sketch below)
  - Note that span(v_1, v_2, …, v_M) equals the column space of G
  - "Face" space = eigenspace of C = span(v_1, v_2, …, v_M)
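A minimal MATLAB sketch of this step, assuming G is the zero-mean data matrix built during training; the variable names (R, U_R, lam, V) are illustrative only.

```matlab
% Efficient computation of the eigenfaces via the small M-by-M matrix R = G'*G.
R = G' * G;                                   % M-by-M, symmetric
[U_R, Lambda_R] = eig(R);                     % eigenvectors u_{R,i}, eigenvalues lambda_{R,i}
[lam, idx] = sort(diag(Lambda_R), 'descend'); % order by decreasing eigenvalue
U_R = U_R(:, idx);

keep = lam > 1e-10 * max(lam);                % drop near-zero modes (mean removal loses one rank)
V = G * U_R(:, keep);                         % v_i = G*u_{R,i}: eigenvectors of C = G*G'
V = V ./ vecnorm(V);                          % normalize each eigenface to unit 2-norm
% The span of the columns of V is the "face" space (the eigenspace of C).
```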
Idea of data projection

[Figure: projecting data from a 3D space to a 2D space]
Eigenface Testing – Facial Identification

- Note that not all M v_i's are needed
  - Only M′ < M v_i's are retained
- Transform the training images as v_iᵀ g = g_T,i, for i = 1, 2, …, M′
- Form the coefficient vector g_T = [g_T,1, g_T,2, …, g_T,M′]ᵀ for every training image
  - For each person k, form the centroid (mean) of the coefficient vectors of that person's images, denoted g_T,k
- Suppose a new face point h appears (already zero-mean)
  - Classify the new face by first computing its coefficients v_iᵀ h = h_T,i, for i = 1, 2, …, M′
  - Form the coefficient vector h_T = [h_T,1, h_T,2, …, h_T,M′]ᵀ
  - Decide that h_T belongs to person k_opt, where g_T,k_opt = argmin_{g_T,k} ‖ h_T − g_T,k ‖₂²
- This is not the optimal detector (a MATLAB sketch of the identification step follows below)
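A sketch of the identification step in MATLAB; trainLabels, testImage, and the other variable names are assumptions (V and gamma come from the training sketches above).

```matlab
% Min-norm facial identification (variable names are assumptions).
% V: N^2-by-M' retained eigenfaces, G: zero-mean training data,
% trainLabels: person ID of each training image, gamma: training mean face.
GT = V' * G;                                  % M'-by-M training coefficient vectors g_T
persons = unique(trainLabels);
centroids = zeros(size(V, 2), numel(persons));
for k = 1:numel(persons)
    centroids(:, k) = mean(GT(:, trainLabels == persons(k)), 2);  % g_{T,k}
end

h  = double(testImage(:)) - gamma;            % new face point, made zero-mean
hT = V' * h;                                  % coefficient vector h_T
[~, kOpt] = min(sum((centroids - hT).^2, 1)); % argmin_k ||h_T - g_{T,k}||_2^2
identity = persons(kOpt);                     % decided person ID
```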
Principle of the min-norm detector

[Figure: a testing sample among training samples]

1. Compute the distances from the testing sample to all training data points.
2. Pick the smallest: g_T,k_opt = argmin_{g_T,k} ‖ h_T − g_T,k ‖₂²
Probability of wrong detection vs. SNR

[Figure: miss detection probability vs. SNR curve]

Question: why is the curve monotonically decreasing?
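One way such a curve could be generated is sketched below: corrupt each test image with white Gaussian noise at the target SNR, run the min-norm detector, and average the classification errors. The function name missRateVsSNR and its arguments are assumptions, not part of the provided project files.

```matlab
function Pmiss = missRateVsSNR(V, gamma, G, trainLabels, testImgs, testLabels, snrdB)
% Estimate miss detection probability vs. SNR (a sketch; names are assumptions).
% V: retained eigenfaces, gamma: mean face, G: zero-mean training data,
% testImgs: vectorized clean test faces (one per column), snrdB: SNR grid in dB.

    % Per-person centroids in the coefficient space spanned by V
    GT = V' * G;
    persons = unique(trainLabels);
    centroids = zeros(size(V, 2), numel(persons));
    for k = 1:numel(persons)
        centroids(:, k) = mean(GT(:, trainLabels == persons(k)), 2);
    end

    Pmiss = zeros(size(snrdB));
    for s = 1:numel(snrdB)
        errors = 0;
        for t = 1:numel(testLabels)
            x = testImgs(:, t);                       % clean vectorized test face
            sigma2 = mean(x.^2) / 10^(snrdB(s)/10);   % noise power for the target SNR
            y = x + sqrt(sigma2) * randn(size(x));    % noisy observation
            hT = V' * (y - gamma);                    % project onto the face space
            [~, kOpt] = min(sum((centroids - hT).^2, 1));
            errors = errors + (persons(kOpt) ~= testLabels(t));
        end
        Pmiss(s) = errors / numel(testLabels);
    end
end
```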
Assignment

- Complete the functions for:
  - Image I/O (10%)
  - Efficient computation of singular vectors (15%)
  - Min-norm identification (15%)
- Obtain a smooth and correct miss detection probability vs. SNR curve using all of the eigenvectors from the covariance matrix (20%)
- Implement the algorithm using:
  - 10% of the total number of eigenvectors (10%)
  - 1% of the total number of eigenvectors (10%)
- Answer the question: how do the results for a reduced number of eigenvectors differ from those obtained with the full set? (10%)
- Plot the miss detection probability vs. SNR curves for all three cases (full set, 10%, and 1% of the total number of eigenvectors) (10%); a possible driver sketch follows below
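A possible driver for the three required curves, reusing the hypothetical missRateVsSNR sketch from the SNR slide; Vfull is assumed to hold all eigenfaces obtained during training.

```matlab
% Sweep the eigenvector budget: full set, 10%, and 1% (names are assumptions).
snrdB = -10:2:10;                             % assumed SNR grid in dB
fractions = [1, 0.10, 0.01];
figure; hold on;
for f = fractions
    Mp = max(1, round(f * size(Vfull, 2)));   % number of retained eigenfaces M'
    Pmiss = missRateVsSNR(Vfull(:, 1:Mp), gamma, G, trainLabels, testImgs, testLabels, snrdB);
    plot(snrdB, Pmiss, '-o', 'DisplayName', sprintf('%d eigenvectors', Mp));
end
legend('show');
xlabel('SNR (dB)'); ylabel('Miss detection probability');
title('Miss detection probability vs. SNR');
```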
Good luck!