A No-Reference Texture Regularity Metric Based on Visual Saliency

ABSTRACT:
This paper presents a no-reference perceptual metric that quantifies the degree of perceived regularity in textures. The metric is based on the similarity of the visual attention (VA) drawn by the textural primitives and on the periodic spatial distribution of foveated fixation regions throughout the image. A ground-truth eye-tracking database for textures is also generated as part of this work and is used to evaluate the performance of the most popular VA models. The proposed texture regularity metric is then computed from the saliency map generated by the best-performing VA model. Subjective testing shows that the proposed metric has a strong correlation with the mean opinion score (MOS) for the perceived regularity of textures. The proposed texture regularity metric can be used to improve the quality and performance of many image processing applications, such as texture synthesis, texture compression, and content-based image retrieval.

EXISTING SYSTEM:
Textures are present in almost everything we see around us, in both natural and man-made objects. Tree bark, leaves, grass, flowers, and ripples of water are all examples of natural textures. Floor tiles, carpets, and all types of printed fabrics are everyday examples of man-made textures. Each of these objects exhibits a spatially repetitive pattern of visual properties that characterizes the specific object and helps us recognize, classify, and segment it. This repeated pattern of pixel intensities or colors constitutes a visual texture. Texture analysis forms the core of many applications such as content-based image retrieval, defect detection on fabrics, and texture synthesis.

DISADVANTAGES OF EXISTING SYSTEM:
Perceptual quality metrics for textures proposed in the past, such as STSIM, are full-reference metrics that assess the similarity between a texture and a reference; they cannot quantify the structure of a newly observed texture when no reference is available. Furthermore, they cannot assess the degree of regularity in a texture image, only its similarity to a reference texture.

PROPOSED SYSTEM:
Our paper proposes a no-reference perceptual metric that predicts the degree of perceived regularity in textures. To the best of our knowledge, this is the first work that quantifies the amount of perceived regularity in textures. Another major novelty of the approach is the use of the visual saliency map (VSM) of a texture to compute an objective texture regularity metric for that texture. To select the right VA model for generating the VSM, we compare nine popular VA models and choose the best one through a set of established performance metrics. Although prior works have evaluated VA models for their accuracy in predicting true saliency on natural images, no prior work benchmarks the performance of VA models on images with exclusively texture content. In this work, we address this gap by building a texture database and systematically capturing the corresponding ground-truth saliency maps. A preliminary version of the proposed texture regularity metric appeared in S. Varadarajan and L. J. Karam, "A no-reference perceptual texture regularity metric," in Proc. IEEE Int. In this journal version, we improve on our earlier work by modifying the similarity and placement regularity scores of the texture regularity metric; a simplified MATLAB sketch of the overall idea is given below.
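To make the pipeline concrete, the following MATLAB sketch shows one simple way to score regularity from a visual saliency map: look for strong secondary peaks in the map's 2-D autocorrelation, i.e., check whether the salient regions repeat periodically. This is only an illustration of the underlying idea and not the metric proposed in the paper (which combines a separate similarity score and a placement regularity score); the function name regularity_sketch, the 5x5 peak-masking window, and the use of xcorr2 are assumptions made for this sketch, and a precomputed saliency map S produced by any chosen VA model is assumed as input.

% Illustrative sketch only (assumes the Signal Processing Toolbox); this is
% NOT the metric of the paper, just the underlying periodicity idea.
% Input:  S - saliency map from a VA model, numeric matrix scaled to [0,1]
% Output: r - crude regularity score (higher means more regular)
function r = regularity_sketch(S)
    S = double(S);
    S = S - mean(S(:));                 % remove the mean so flat maps score low
    A = xcorr2(S);                      % 2-D autocorrelation of the saliency map
    A = A / max(A(:));                  % normalize so the central peak equals 1
    [h, w] = size(A);
    cy = ceil(h/2); cx = ceil(w/2);
    A(cy-2:cy+2, cx-2:cx+2) = -Inf;     % crudely mask out the trivial central peak
    % Regular textures produce strong secondary peaks because the salient
    % primitives repeat periodically; irregular textures do not.
    r = max(A(:));
end

Under this sketch, a saliency map with a checkerboard-like repetition of fixation regions would yield a score close to 1, while a map computed on a random, unstructured texture would yield a much smaller value.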
We also show that the proposed metric correlates more strongly with subjective regularity scores than the Pattern Regularity metric. In addition, the proposed regularity metric is evaluated for robustness against small geometric and photometric transformations and is found to be robust (an illustrative MATLAB sketch of such an evaluation is given at the end of this document).

ADVANTAGES OF PROPOSED SYSTEM:
The regularity metric can be further improved by applying better VA models that more closely predict the effect of texture regularity on human visual saliency.
The metric can be used to reduce the search space in content-based image retrieval applications.

SYSTEM ARCHITECTURE:

SYSTEM REQUIREMENTS:

HARDWARE REQUIREMENTS:
System : Pentium IV, 2.4 GHz
Hard Disk : 40 GB
Floppy Drive : 1.44 MB
Monitor : 15-inch VGA Colour
Mouse : Logitech
RAM : 512 MB

SOFTWARE REQUIREMENTS:
Operating System : Windows XP/7
Coding Language : MATLAB
Tool : MATLAB R2013a

REFERENCE:
Srenivas Varadarajan and Lina J. Karam, "A No-Reference Texture Regularity Metric Based on Visual Saliency," IEEE Transactions on Image Processing, vol. 24, no. 9, September 2015.
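For completeness, the MATLAB sketch referenced above illustrates how the reported evaluation could be approximated: score a set of textures, compute the Spearman rank correlation between the objective scores and the subjective MOS, and spot-check robustness under a small geometric and photometric transformation. The folder name textures, the MOS vector mos, and computeSaliency are placeholders introduced for this sketch (the last stands in for whichever VA model is selected), and the Image Processing and Statistics Toolboxes are assumed.

% Illustrative evaluation sketch only, not the authors' code.
% Assumes: a folder 'textures' of texture images, a column vector 'mos' of
% mean opinion scores in matching order, the regularity_sketch function from
% the earlier listing, and computeSaliency() as a placeholder for the chosen
% VA model.
files  = dir(fullfile('textures', '*.png'));
scores = zeros(numel(files), 1);
for k = 1:numel(files)
    img = im2double(imread(fullfile('textures', files(k).name)));
    if size(img, 3) == 3, img = rgb2gray(img); end
    S = computeSaliency(img);           % placeholder: saliency map from the chosen VA model
    scores(k) = regularity_sketch(S);
end

% Rank correlation between the objective scores and the subjective MOS.
rho = corr(scores, mos, 'Type', 'Spearman');
fprintf('Spearman correlation with MOS: %.3f\n', rho);

% Crude robustness check: a small rotation plus a mild gamma change should
% change the predicted regularity only slightly.
imgT   = imadjust(imrotate(img, 2, 'bilinear', 'crop'), [], [], 1.1);
deltaR = abs(regularity_sketch(computeSaliency(imgT)) - scores(end));
fprintf('Score change under small transform: %.3f\n', deltaR);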