International Journal of Engineering Trends and Technology (IJETT) – Volume 8 Number 8- Feb 2014
Image Classification Methods
Nishana Rasheed#1, Shreeja.R*2
#Department of Computer Science and Engineering, MES College of Engineering, Kuttippuram, Calicut University, India
*Assistant Professor, Department of Computer Science and Engineering, MES College of Engineering, Kuttippuram, Calicut University, India
Abstract— The problem of image classification has attracted considerable research interest in the field of image processing. Traditional methods often convert an image to a vector and then use a vector-based classifier. A novel multiple rank regression model (MRR) for matrix data classification is used instead. Unlike traditional vector-based methods, multiple left projecting vectors and right projecting vectors are employed to regress each matrix datum to its label for each category. The convergence behaviour, initialization and computational complexity of the different classification methods are analysed. Compared with vector-based regression methods, MRR achieves higher accuracy and has lower computational complexity; compared with traditional supervised tensor-based methods, MRR performs better for matrix data classification.
Keywords— pattern recognition, computer vision, classification,
rank of regression, fitting error, learning capacity, singularity
problem.
INTRODUCTION
Classification algorithms are based on the assumption that an image depicts one or more features and that each of these features belongs to one of several distinct and exclusive classes. Images such as face images, palm images, or MRI [8] data are usually represented in the form of data matrices. Additionally, in video data mining, the data in each time frame is also a matrix. How to classify this kind of data is one of the most important topics in both image processing and machine learning. In the literature, several classification approaches have been proposed, such as KNN, SVM, 1DREG, LDA, 2DLDA, GBR and MRR.
LITERATURE SURVEY
A. K-Nearest Neighbour Classifier (KNN)
The K-Nearest Neighbour classifier (KNN) [1] by G. Shakhnarovich is similarity based. In pattern recognition, the k-nearest neighbours algorithm is a non-parametric method for classification and regression that predicts an object's value or class membership based on the k closest training examples in the feature space. KNN is a type of instance-based learning, or lazy learning, where the function is only approximated locally and all computation is deferred until classification. It is amongst the simplest of all machine learning algorithms: an object is classified by a majority vote of its neighbours, the object being assigned to the class most common amongst its k nearest neighbours (k is a positive integer, typically small). If k = 1, the object is simply assigned to the class of its single nearest neighbour.
Fig 1 K-Nearest Neighbor Classifier
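As an illustration, the voting rule described above can be sketched as follows (a minimal NumPy example on hypothetical 2-D points, not the classifier implementation used in the surveyed experiments):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    d = np.linalg.norm(X_train - x, axis=1)       # Euclidean distances to x
    nearest = np.argsort(d)[:k]                   # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]              # most common label wins

# Toy training set: two points per class.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.05, 0.1]), k=3))  # -> 0 (two of three neighbours are class 0)
```

With k = 1 the same function simply returns the label of the single nearest point.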
ISSN: 2231-5381
B. Support Vector Machine
The Support Vector Machine (SVM) [2] by V. N. Vapnik is margin based. In SVM a data point is viewed as a p-dimensional vector, and such points are separated with a (p − 1)-dimensional hyperplane; this is called a linear classifier. Among the many hyperplanes that might classify the data, a reasonable choice for the best hyperplane is the one that represents the largest separation, or margin, between the two classes.
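The notion of margin can be made concrete with a small sketch: for a hand-picked hyperplane w·x + b = 0 on toy data, the geometric margin is the smallest distance from any point to the hyperplane. SVM training (not shown here) would choose w and b to maximize exactly this quantity:

```python
import numpy as np

# Toy linearly separable data in R^2 with labels +1 / -1.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1, 1, -1, -1])

# A candidate separating hyperplane w.x + b = 0, chosen by hand for illustration.
w = np.array([1.0, 1.0])
b = 0.0

# Geometric margin: smallest signed distance of any point to the hyperplane.
margin = np.min(y * (X @ w + b)) / np.linalg.norm(w)
pred = np.sign(X @ w + b)          # linear classifier: side of the hyperplane
```

Here every point is correctly classified and the margin is 4/sqrt(2); the SVM solution is the hyperplane for which this value is largest over all feasible (w, b).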
http://www.ijettjournal.org
Page 461
Fig 2 Support Vector Machine
C. One-Dimensional Regression (1DREG)
Among these approaches, one-dimensional regression methods (denoted 1DREG) [3] by C. M. Bishop have been widely used in many real applications due to their simplicity, effectiveness and inductive nature. 1DREG is a representative vector-based regression method and a well-known model for classification. Denote the matrix data Xi (the ith training matrix) as an mn-dimensional vector xi by concatenating its rows (or columns). 1DREG aims to regress each datum to its label vector by computing c transformation vectors and constants. Because 1DREG converts the matrix data into a vector, it loses the correlations within the matrix data, and its computational cost is unacceptable if the matrix scale is large.
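A minimal sketch of the 1DREG idea on synthetic matrix data (the data, sizes and 0.5 threshold are illustrative assumptions, not the experimental setup of the surveyed paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, N = 4, 5, 40
X = rng.normal(size=(N, m, n))                   # toy 4x5 "images"
y = np.r_[np.zeros(N // 2), np.ones(N // 2)]
X[:N // 2, :2, :] += 2.0                         # class 0: bright top rows
X[N // 2:, 2:, :] += 2.0                         # class 1: bright bottom rows

V = X.reshape(N, m * n)                          # vectorize: each matrix -> mn-vector
A = np.hstack([V, np.ones((N, 1))])              # append the constant term
w, *_ = np.linalg.lstsq(A, y, rcond=None)        # regress vectors to their labels
acc = ((A @ w > 0.5).astype(int) == y).mean()    # classify by thresholding the fit
```

Note that the reshape step is exactly where the row/column correlations of the matrix are discarded, and that w has mn + 1 entries, which is what makes the method expensive for large matrices.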
D. General Bilinear Regression (GBR)
As mentioned in [6], GBR is the two-dimensional counterpart of 1DREG. It replaces the regression function of 1DREG by a bilinear regression function. Besides, it uses only one left projecting vector together with one right projecting vector, so its fitting error is too large for some real regression problems.
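The bilinear form u^T X v can be fitted by alternating least squares, since fixing u makes the model linear in v and vice versa. A sketch on synthetic matrix data (the data, random initialization and iteration count are illustrative choices, not the procedure of [6]):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, N = 4, 5, 40
X = rng.normal(size=(N, m, n))
y = np.r_[-np.ones(N // 2), np.ones(N // 2)]     # labels -1 / +1
X[:N // 2, :2, :] += 2.0                         # class -1: bright top rows
X[N // 2:, 2:, :] += 2.0                         # class +1: bright bottom rows

u, v = rng.normal(size=m), np.ones(n)
for _ in range(20):                              # alternating least squares
    F = np.einsum('amn,m->an', X, u)             # fix u: features X_i^T u, linear in v
    v = np.linalg.lstsq(F, y, rcond=None)[0]
    G = np.einsum('amn,n->am', X, v)             # fix v: features X_i v, linear in u
    u = np.linalg.lstsq(G, y, rcond=None)[0]

scores = np.einsum('amn,m,n->a', X, u, v)        # single bilinear form u^T X_i v
acc = (np.sign(scores) == y).mean()
```

Because only one (u, v) pair is used, the model is a rank-one fit; this is the source of the large fitting error noted above when the regression surface is genuinely higher rank.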
E. Linear Discriminant Analysis (LDA)
Linear Discriminant Analysis projects data onto a lower dimensional vector space such that the ratio of between-class distance to within-class distance is maximized, thus achieving maximum discrimination between the classes. It is based on maximizing the distance between the means of the classes: as shown in Fig 3, the direction W is taken such that the difference between the class means projected onto this direction, µ1 and µ2, is large and the variances σ1 and σ2 around these means are small,

J(W) = argmax_W (µ1 − µ2)² / (σ1² + σ2²).

LDA suffers from the singularity problem.
Fig 3 Linear Discriminant Analysis
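The Fisher criterion above has the well-known closed-form solution W ∝ Sw⁻¹(µ1 − µ2), where Sw is the within-class scatter matrix; a sketch on synthetic Gaussian classes (data and class means are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(loc=[0.0, 0.0], size=(100, 2))    # class 1 samples
B = rng.normal(loc=[4.0, 4.0], size=(100, 2))    # class 2 samples

mu1, mu2 = A.mean(axis=0), B.mean(axis=0)
Sw = (A - mu1).T @ (A - mu1) + (B - mu2).T @ (B - mu2)  # within-class scatter
w = np.linalg.solve(Sw, mu1 - mu2)               # Fisher direction W

# Classify by nearest projected class mean: the 1-D projections separate well.
mid = (w @ mu1 + w @ mu2) / 2.0
labels = np.r_[np.ones(100), np.zeros(100)]
acc = ((np.r_[A, B] @ w > mid) == labels).mean()
```

The singularity problem mentioned above arises when Sw is not invertible, e.g. when the dimension exceeds the number of samples, which is why intermediate dimension reduction is applied first.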
F. Two Dimensional Linear Discriminant Analysis (2DLDA)
Two-dimensional Linear Discriminant Analysis (2DLDA) [4] by J. Ye and R. Janardan is a popular supervised tensor-based approach. 2DLDA aims to find two transformation matrices L and R which map Xi to its low-dimensional embedding Zi by Zi = L^T Xi R. It tries to minimize the within-class distance Dw and maximize the between-class distance Db.
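The bilinear mapping Zi = L^T Xi R can be illustrated directly. Here L and R are random orthonormal matrices purely to show the dimensions involved; 2DLDA itself would choose them by optimizing the scatter criteria Dw and Db:

```python
import numpy as np

rng = np.random.default_rng(4)
m, n, p, q = 8, 10, 2, 3                          # original and embedded sizes
X = rng.normal(size=(m, n))                       # one matrix datum Xi
L = np.linalg.qr(rng.normal(size=(m, p)))[0]      # left transform, orthonormal columns
R = np.linalg.qr(rng.normal(size=(n, q)))[0]      # right transform, orthonormal columns
Z = L.T @ X @ R                                   # embedding Zi = L^T Xi R, shape (p, q)
```

The point of the two-sided transform is that an m×n matrix is reduced to p×q while remaining a matrix, so row/column structure is preserved rather than flattened away.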
G. Multiple Rank Regressions (MRR)
The multiple rank regression model (MRR) [5] is meant for matrix data classification. Unlike traditional vector-based methods, multiple-rank left projecting vectors and right projecting vectors are employed to regress each matrix datum to its label for each category. MRR achieves higher accuracy and lower computational complexity than vector-based regression; compared with traditional supervised tensor-based methods, it performs better for matrix data classification.
TABLE 2
ACCURACY OF DIFFERENT CLASSIFICATION METHODS
Fig 4 Multiple Rank Regression
MRR is a combination of multiple two-category classifiers via the one-versus-rest strategy. More concretely, in training the classifier for the r-th category, the labels of the points that belong to the r-th category are one; if a point does not belong to this class, its label is zero. Multiple-rank left projecting vectors and right projecting vectors are employed to regress each matrix dataset to its label for each category. Compared with traditional supervised tensor-based methods, MRR performs better for matrix data classification, although its computational complexity is higher for uncorrelated data. MRR can be extended to unsupervised and semi-supervised cases.
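The MRR idea of regressing each matrix to its label through several left/right projecting vector pairs can be sketched as a rank-k bilinear fit via alternating least squares. This is a simplification: the full model in [5] also includes regularization and the one-versus-rest construction across c categories, while the data and initialization below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, N, k = 4, 5, 40, 2                           # k pairs of projecting vectors
X = rng.normal(size=(N, m, n))
y = np.r_[-np.ones(N // 2), np.ones(N // 2)]       # one binary one-vs-rest problem
X[:N // 2, :2, :] += 2.0
X[N // 2:, 2:, :] += 2.0

U = rng.normal(size=(m, k))                        # left projecting vectors u_1..u_k
V = rng.normal(size=(n, k))                        # right projecting vectors v_1..v_k
for _ in range(30):
    # Fix U: the model sum_j u_j^T X_i v_j is linear in V via features X_i^T u_j.
    F = np.einsum('amn,mk->ank', X, U).reshape(N, n * k)
    V = np.linalg.lstsq(F, y, rcond=None)[0].reshape(n, k)
    # Fix V: symmetric step, linear in U via features X_i v_j.
    G = np.einsum('amn,nk->amk', X, V).reshape(N, m * k)
    U = np.linalg.lstsq(G, y, rcond=None)[0].reshape(m, k)

scores = np.einsum('amn,mk,nk->a', X, U, V)        # sum_j u_j^T X_i v_j
acc = (np.sign(scores) == y).mean()
```

With k = 1 this reduces to GBR, and letting k grow recovers the flexibility of 1DREG, which is exactly the trade-off between fitting error and learning capacity discussed in the performance analysis below.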
PERFORMANCE ANALYSIS
TABLE 1
PERFORMANCE ANALYSIS BASED ON DIFFERENT PARAMETERS
Parameter           | 1DREG     | GBR       | MRR
Rank of Regression  | Largest   | Weakest   | Moderate
Fitting Error       | Smallest  | Too large | Trade off
Learning Capacity   | Strongest | Weakest   | Trade off
Speed               | Slowest   | Fastest   | Moderate
Accuracy            | Moderate  | Moderate  | Highest
TABLE 3
COMPUTATIONAL TIME OF DIFFERENT CLASSIFICATION METHODS
CONCLUSIONS
It was found that LDA suffers from the singularity problem; by applying intermediate dimension reduction the singularity problem was reduced, and no singularity problem was seen in 2DLDA. 1DREG suffers from over-fitting: it has the smallest fitting error and hence the strongest learning capacity. GBR has the weakest capacity for learning but the strongest capacity for generalization. MRR for data matrix classification differs from traditional regression approaches, which reformulate the matrix data into a vector and use only one projecting vector; instead, several left projecting vectors and the same number of right projecting vectors are used to regress each matrix datum to its label for each category. Essentially, MRR can be regarded as a trade-off between the capacity of learning and the capacity of generalization for regression. MRR not only achieves satisfactory accuracy but also has low computational complexity.
REFERENCES
[1] G. Shakhnarovich, T. Darrell, and P. Indyk, Eds., "Nearest-Neighbor Methods in Learning and Vision: Theory and Practice", Cambridge, MA: MIT Press, 2006.
[2] V. N. Vapnik, "The Nature of Statistical Learning Theory", New York: Springer-Verlag, 1995.
[3] C. M. Bishop, "Pattern Recognition and Machine Learning", Secaucus, NJ: Springer-Verlag, 2006.
[4] J. Ye, R. Janardan, and Q. Li, "Two-dimensional linear discriminant analysis", in Advances in Neural Information Processing Systems, Cambridge, MA: MIT Press, 2004.
[5] Chenping Hou, Feiping Nie, Dongyun Yi, and Yi Wu, "Efficient image classification via multiple rank regression", IEEE Transactions on Image Processing, January 2013.
[6] K. R. Gabriel, "Generalised bilinear regression", Biometrika, vol. 85, no. 3, pp. 689–700, 1998.
[7] H. Govil, M. Kumar, and S. Farooq, "Comparative evaluation of fuzzy based object oriented image classification method with parametric and non-parametric classifiers", 14th Annual Conference on Geospatial Information Technology and Applications, 2012.
[8] A. W.-K. Kong, D. D. Zhang, and M. S. Kamel, "A survey of palmprint recognition", Pattern Recognition, vol. 42, no. 7, pp. 1408–1418, 2009.