Robust Non-Negative Factorization using L11 norm and its application in image clustering

Kai LIU
Outline
1. Introduction of NMF
2. Why we need robust NMF
3. My theoretical work
4. Experiments and results
Introduction
• Non-negative Matrix Factorization (NMF) has a good interpretation: each data point is approximated by a non-negative combination of non-negative basis vectors.
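In standard notation (not preserved on the slide), NMF approximates a non-negative data matrix by the product of two non-negative factors:

$$X \approx F G, \qquad X \in \mathbb{R}^{m \times n}_{+},\; F \in \mathbb{R}^{m \times k}_{+},\; G \in \mathbb{R}^{k \times n}_{+},\; k \ll \min(m, n)$$

so each column of X (e.g. an image) is a non-negative mixture of the k columns of F (the learned parts/features), which is what makes the factorization interpretable.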
Why Robust NMF?
• Traditional NMF:
• Objective:

$$\min_{F, G \ge 0} \; \|X - FG\|_F^2 \quad \text{s.t.}\; G^T G = I$$

• With the least-squares (Frobenius) error as the objective, the solution is unstable under noise and outliers: squaring the residuals lets a few large errors dominate.
• (Figure: examples of outliers)
My method
• Change the objective to:

$$\min_{F, G \ge 0} \; \|X - FG\|_{1,1} \quad \text{s.t.}\; G^T G = I$$

• The new objective is the sum of absolute values of every entry of X − F*G, while traditional NMF minimizes the sum of squares; absolute values grow only linearly with an outlier, so outliers distort the factors less.
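To see why the entrywise L1 objective is more robust, a small numeric check (illustrative only; the matrices here are made up, not from the experiments) compares how a single corrupted entry inflates the two objectives:

```python
import numpy as np

# A perfect rank-1 factorization, then one corrupted entry in X.
rng = np.random.default_rng(0)
F = rng.random((5, 1))
G = rng.random((1, 6))
X = F @ G
X_noisy = X.copy()
X_noisy[0, 0] += 10.0          # a single large outlier

E = X_noisy - F @ G            # residual; nonzero only at the outlier

frob_sq = np.sum(E ** 2)       # traditional NMF objective: ||E||_F^2
l11 = np.sum(np.abs(E))        # robust objective: ||E||_{1,1}

print(frob_sq, l11)            # 100.0 vs 10.0: squaring amplifies the outlier
```

The squared objective penalizes the outlier ten times more heavily here, which is exactly the instability the L1,1 objective avoids.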
Algorithm
• Updating algorithm: multiplicative update rules for F and G (the equations were images and are not preserved here)
• where D is the diagonal reweighting matrix from the trace identity ||X||_{1,1} = trace(X·D·X^T)
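Since the slide's exact update equations were not preserved, the following is only a sketch of one plausible IRLS-style multiplicative scheme for the L1,1 objective. The elementwise 1/|residual| reweighting and the unconstrained updates are my assumptions, and the orthogonality constraint on G is omitted:

```python
import numpy as np

def l11_nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Sketch of an IRLS-style multiplicative update for
    min ||X - F G||_{1,1} (assumed scheme, not the slide's rules).
    Each residual entry is weighted by 1/|residual|, so the weighted
    squared error matches the L1,1 objective at the current iterate.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k))
    G = rng.random((k, n))
    for _ in range(n_iter):
        E = X - F @ G
        W = 1.0 / (np.abs(E) + eps)   # IRLS weights, recomputed each pass
        WX = W * X                    # elementwise reweighted data
        WFG = W * (F @ G)             # elementwise reweighted reconstruction
        # Multiplicative updates keep F and G non-negative.
        F *= (WX @ G.T) / (WFG @ G.T + eps)
        G *= (F.T @ WX) / (F.T @ WFG + eps)
    return F, G
```

Because every factor in the update ratios is non-negative, non-negativity of F and G is preserved automatically, which is the usual appeal of multiplicative rules.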
The idea
• 1. First, prove that

$$\|X\|_{1,1} = \mathrm{trace}(X D X^T)$$

for a suitable diagonal matrix D.
• 2. Take the derivative of the trace with respect to F and G (here X in the trace stands for the residual X − F*G, and the data matrix X is constant).
• 3. Prove the convergence of the algorithm.
• 4. Prove the correctness of the algorithm.
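The slide does not preserve the definition of D. One diagonal choice that makes the step-1 identity hold exactly (an assumption on my part, not necessarily the author's D) weights each column of the residual E = X − FG by the ratio of its L1 norm to its squared L2 norm:

$$D_{jj} = \frac{\sum_i |E_{ij}|}{\sum_i E_{ij}^2}
\quad\Longrightarrow\quad
\operatorname{trace}(E D E^{\top}) = \sum_j D_{jj} \sum_i E_{ij}^2 = \sum_{i,j} |E_{ij}| = \|E\|_{1,1}.$$

Holding D fixed, minimizing the trace is a weighted least-squares problem, which is what makes the derivatives in step 2 tractable.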
Experiments on ATNT
• L2 algorithm: (figure: learned features)
• L1 algorithm: (figure: learned features)
Experiments on Caltech 101
• We perform image clustering on Caltech 101; the table below reports clustering accuracy:

Set      L1       L2       kmeans
Set 1    0.4833   0.4167   0.4333
Set 2    0.5167   0.4167   0.4667
Set 3    0.5167   0.3833   0.4000
Set 4    0.5000   0.4167   0.4500
Set 5    0.4500   0.3833   0.3833
Set 6    0.4833   0.4167   0.4000
Set 7    0.4833   0.4333   0.4333
Set 8    0.5833   0.4833   0.5167
Set 9    0.4333   0.3833   0.3833
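As a quick sanity check on the table (the accuracies below are copied from it), averaging over the nine sets:

```python
import numpy as np

# Clustering accuracies from the Caltech 101 table above.
l1     = [0.4833, 0.5167, 0.5167, 0.5000, 0.4500, 0.4833, 0.4833, 0.5833, 0.4333]
l2     = [0.4167, 0.4167, 0.3833, 0.4167, 0.3833, 0.4167, 0.4333, 0.4833, 0.3833]
kmeans = [0.4333, 0.4667, 0.4000, 0.4500, 0.3833, 0.4000, 0.4333, 0.5167, 0.3833]

means = {name: float(np.mean(v)) for name, v in
         [("L1", l1), ("L2", l2), ("kmeans", kmeans)]}
print(means)  # L1 has the highest mean accuracy of the three

# On these nine sets, L1 is never worse than either baseline.
assert all(a >= b and a >= c for a, b, c in zip(l1, l2, kmeans))
```

On this data L1 NMF averages roughly 0.49 accuracy versus about 0.43 for k-means and 0.41 for L2 NMF.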
Conclusion
• With L1-norm NMF, the features (F) we obtain are cleaner than those from L2-norm NMF.
• In many settings, L1 NMF also achieves better clustering accuracy (via G) than L2-norm NMF and k-means.
Questions and Answers
Thank you!