FINGERPRINT IDENTIFICATION FOR CYRIX MEDIAGX BASED EMBEDDED SYSTEM
Ziad S Siddique
B.S., The University of Mississippi, 2006
PROJECT
Submitted in partial satisfaction of
the requirements for the degree of
MASTER OF SCIENCE
in
COMPUTER ENGINEERING
at
CALIFORNIA STATE UNIVERSITY, SACRAMENTO
SPRING
2010
FINGERPRINT IDENTIFICATION FOR CYRIX MEDIAGX BASED EMBEDDED SYSTEM
A Project
by
Ziad S Siddique
Approved by:
__________________________________, Committee Chair
Jing Pang, Ph. D.
__________________________________, Second Reader
Preetham Kumar, Ph. D.
____________________________
Date
Student: Ziad S Siddique
I certify that this student has met the requirements for format contained in the University format
manual, and that this project is suitable for shelving in the Library and credit is to be awarded for
the Project.
__________________________, Graduate Coordinator
Suresh Vadhva, Ph. D.
Department of Computer Engineering
________________
Date
Abstract
of
FINGERPRINT IDENTIFICATION FOR CYRIX MEDIAGX BASED EMBEDDED SYSTEM
by
Ziad S Siddique
Nowadays, portable devices like cell phones, smart phones, PDAs, netbooks and laptops
have become household items. For their own convenience, people love taking their computers
or personal data with them at all times. Having these portable devices with us enables
frequent access to email, lets us look up someone's information in an easily accessible and
searchable form, and allows much of the other information we need to be obtained instantly. The very
portability of these devices presents a problem: if any of these devices is lost
or stolen, it gives other people access to our emails, photos and other personal
information.
Biometric fingerprint readers offer an ideal solution to this problem. A fingerprint is much
better than even a very secure password, and a fingerprint match is required before gaining
access to the information on specific devices. In this project, I have researched and
implemented a fingerprint identification process application for a specific embedded
system. In conclusion, this project identified mechanisms and features of fingerprint
identification that can be utilized for application and implementation in specific systems.
_______________________, Committee Chair
Jing Pang, Ph. D.
_______________________
Date
ACKNOWLEDGMENT
First, I would like to extend my gratitude to my advisor, Dr. Jing Pang, for the invaluable
advice and positive encouragement she provided throughout the course of this project.
She greatly inspired me to work on this project. Special thanks to Dr. Preetham Kumar
for proofreading this report. I would also like to thank the Department of
Computer Engineering for providing me with a good environment and facilities to complete
this project successfully.
TABLE OF CONTENTS
Page
Acknowledgment .......................................................................................................... v
List of Tables ............................................................................................................. vii
List of Figures ........................................................................................................... viii
Chapter
1. INTRODUCTION .................................................................................................. 1
2. CYRIX MEDIAGX ARCHITECTURAL SYSTEM OVERVIEW ....................... 2
2.1 Cyrix MediaGX Processor Overview ......................................................... 4
2.2 Cyrix MediaGX Cx5510 Processor Overview ........................................... 4
2.3 Compatible Operating System and Compiler ............................................. 5
3. FINGERPRINT IDENTIFICATION PROCESS IMPLEMENTATION ............. 6
3.1 Design Flow ................................................................................................ 7
3.2 BMP File Format ........................................................................................ 8
3.3 Image Normalization ............................................................................... 11
3.4 Edge Detection .......................................................................................... 15
3.5 Image Binarization .................................................................................... 18
3.6 Noise Filter................................................................................................ 20
3.7 Image Thinning ........................................................................................ 23
3.8 Minutiae Extraction ................................................................................. 28
3.9 Experimental Result .................................................................................. 32
4. LIMITATIONS .................................................................................................... 33
5. CONCLUSIONS................................................................................................... 35
Bibliography ............................................................................................................... 36
LIST OF TABLES
Page
Table 1 BMP (windows) header format (54 bytes) ......................................... 9
Table 2 5X5 Laplacian of gaussian convolution kernel ................................. 17
Table 3 3X3 Window frame for noise filter ................................................... 20
Table 4 Properties of crossing number .......................................................... 29
Table 5 Experimental result of minutiae extraction ....................................... 32
LIST OF FIGURES
Page
Figure 1 Cyrix mediagx system architecture ................................................... 2
Figure 2 Fingerprint identification design flow ............................................... 7
Figure 3 Bitmap image format ......................................................................... 8
Figure 4 Data offset starting location equation .............................................. 10
Figure 5 Histogram distribution of grayscale image ...................................... 11
Figure 6 Weighted arithmetic mean ................................................................ 12
Figure 7 Population variance .......................................................................... 12
Figure 8 Image normalization equation .......................................................... 13
Figure 9 Normalized bitmap image ................................................................ 14
Figure 10 Graphical representation of laplacian method ............................... 15
Figure 11 Laplacian equation ......................................................................... 16
Figure 12 Laplacian of gaussian equation ...................................................... 16
Figure 13 Edge detected bitmap image .......................................................... 17
Figure 14 Bi-modal intensity distribution ...................................................... 18
Figure 15 Binarized bitmap image ................................................................. 19
Figure 16 Noisy white pixel conversion ........................................................ 20
Figure 17 Noisy black pixel conversion ........................................................ 20
Figure 18 Eight combinations to determine white pixel validity ................... 21
Figure 19 Eight combinations to determine black pixel validity ................... 21
Figure 20 Filtered bitmap image .................................................................... 22
Figure 21 3X3 Window frame for hilditch algorithm .................................... 23
Figure 22 Eight combinations for 1st level hilditch algorithm ...................... 24
Figure 23 Eight combinations for 2nd level hilditch algorithm ..................... 24
Figure 24 Cases to retain pixel value ............................................................. 25
Figure 25 Thinned bitmap image ................................................................... 27
Figure 26 3X3 window frame for minutiae extraction .................................. 28
Figure 27 Crossing number equation ............................................................. 28
Figure 28 Example of crossing number properties ........................................ 29
Figure 29 Example of false minutiae structure .............................................. 30
Figure 30 Angle shown for minutiae point .................................................... 30
Figure 31 Polar coordinates equation ............................................................ 31
Figure 32 Thinned bitmap image for minutiae extraction ............................. 31
Chapter 1
INTRODUCTION
Nowadays, portable devices like cell phones, smart phones, PDAs, netbooks and laptops
have become common household items. Having these portable devices with us at all
times enables frequent access to email, lets us look up information in an easily accessible and
searchable form, and allows much of the information we need for daily use to be
obtained instantly. However, because of the portability of these devices, losing one
or having it stolen can give others access to our emails, photos
and much other personal information.
Biometrics offers an ideal solution to this problem. Biometrics takes our unique
physical characteristics and uses them for identification and verification. It is
much better than even very secure passwords, since an identity match is required before
one can gain access to the information on specific devices. This project looked into the
implementation of a biometric fingerprint identification process for a specific embedded
system.
In the first phase of the project, I researched the architectural overview of the Cyrix
MediaGX based embedded system. In addition, I researched and tried installing different
operating systems that would be compatible with the instruction set of the system.
During the second phase of the project, I identified and implemented a fingerprint
identification process that included a sequence of image enhancement steps and identity
point collection for better verification results.
Chapter 2
CYRIX MEDIAGX ARCHITECTURAL SYSTEM OVERVIEW
The introduction of the Cyrix MediaGX system in February 1997 established a new class
of low cost, high performance PC architectures [1]. The general principle is that the more
processing that occurs on the PC's CPU side, the more efficient the overall system
performance. The MediaGX architecture integrates the graphics and audio functions, the PCI
interface and the memory control unit into the processor unit. This feature eliminates potential
system conflicts and end-user configuration problems. The MediaGX system consists of
two chips: the Cyrix MediaGX processor and the Cyrix MediaGX Cx5510 companion chip
[2]. The audio and video functions of the two chips operate under the control of the
Virtual System Architecture (VSA) design.
Figure 1: Cyrix mediagx system architecture [2]
Figure 1 shows the architectural overview of a Cyrix MediaGX based system.
Communication is established between the CPU and the companion chip Cx5510 through the PCI
bus. A 64-bit data bus is present to access the EDO DRAM memory. In addition, the
companion chip Cx5510 connects to the I/O devices through the ISA and IDE buses.
2.1 Cyrix MediaGX Processor Overview:
The MediaGX processor is an x86 compatible processor that communicates directly with the PCI
interface and EDO DRAM over a dedicated 64-bit data bus. High quality SVGA is
provided by an advanced graphics accelerator on the MediaGX processor [2]. Graphics are
handled by a dedicated pipeline on the CPU itself, and the display controller is also located on the
main processor. The Cyrix MediaGX processor core operates from a 3.3 or 3.6 volt power supply.
Among other features, it includes [2]:
- PR (Performance Rating)-133 performance at 133 MHz
- Integrated Floating Point Unit
- 16-KByte Unified L1 Cache
- 64-bit FPM/EDO DRAM Controller
2.2 Cyrix MediaGX Cx5510 Processor Overview:
The Cyrix MediaGX Cx5510 is a new generation integrated, single-chip controller for the
MediaGX line of x86 compatible processors. This companion chip houses the audio controller and
uses Virtual System Architecture software to mimic the functionality of industry standard
audio chips. The Cx5510 is a bridge to the ISA bus, IDE and I/O ports [2]. It bridges the
MediaGX processor over the PCI bus to the ISA bus and performs traditional chipset functions.
2.3 Compatible Operating System and Compiler:
The Cyrix MediaGX is an x86 compatible processor. However, it has only a 486 compatible
instruction set, so any software or operating system requiring the Pentium's newer instructions
will not work on a Cyrix MediaGX based system [3]. Taking this into consideration,
operating systems including Windows XP and later versions are not
compatible with the MediaGX processor. Microsoft Windows for Workgroups 3.11 is the most
suitable and appropriate operating system to accomplish the tasks and obtain the desired
results for this project; Windows for Workgroups version 3.11 was developed mainly for
the embedded system market [4]. It was also necessary to choose an appropriate compiler and
integrated development environment (IDE) for developing Windows 3.11 compatible applications.
The Borland C++ 3.1 compiler/IDE was chosen for software development to
serve this purpose, and this integrated development environment was used to implement the
fingerprint identification process.
Chapter 3
FINGERPRINT IDENTIFICATION PROCESS IMPLEMENTATION
Fingerprints are the oldest and most widely used biometric for identification
because of their high acceptability [5]. There has been a considerable amount of interest,
and rapid research and development, in the field of automated pattern recognition
over the last three decades. Today, the use of computers in fingerprint
identification is highly desirable in many applications. Building security systems and
law-enforcement work are the most common examples of the use of fingerprint
identification and recognition [6]. This project focuses on the software
implementation of the fingerprint identification process. The most commonly used approach,
minutiae based fingerprint identification, is implemented as part of the software development.
The process includes a series of image enhancement and minutiae extraction steps
that can be classified into the following phases [5]:
1. Histogram Normalization
2. Edge Detection
3. Image Binarization
4. Noise Filter
5. Image Thinning
6. Minutiae Extraction
A specific image format, the BMP file format, is used for this project work.
3.1 Design Flow:
Input Bitmap Image (.BMP)
    -> Apply Histogram Normalization -> Normalized Bitmap Image (.BMP)
    -> Apply Edge Detection (Laplacian Algorithm) -> Edge Detected Bitmap Image (.BMP)
    -> Apply Binarization -> Binarized Bitmap Image (.BMP)
    -> Apply Noise Filter -> Filtered Bitmap Image (.BMP)
    -> Apply Thinning (Hilditch Algorithm) -> Thinned Bitmap Image (.BMP)
    -> Apply Minutiae Extraction Algorithm -> Final Minutiae Extracted Data File (.TXT)
Figure 2: Fingerprint identification design flow
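The flow in figure 2 can be read as a simple pipeline of processing stages. As an illustration only, the sketch below shows how such a pipeline might be chained in code; the stage function names are assumptions for this sketch and are not taken from the project source.

    /* Illustrative pipeline sketch; these stage functions are assumed placeholders,
       not the project's actual routines. Each stage reads one .BMP file, writes the
       next intermediate file, and returns 0 on success. */
    int normalize_image(const char *in, const char *out);
    int detect_edges(const char *in, const char *out);      /* Laplacian of Gaussian  */
    int binarize_image(const char *in, const char *out);
    int filter_noise(const char *in, const char *out);
    int thin_image(const char *in, const char *out);        /* Hilditch, three passes */
    int extract_minutiae(const char *in, const char *out);  /* writes a .TXT data file */

    int run_pipeline(void)
    {
        if (normalize_image("input.bmp", "normalized.bmp"))   return 1;
        if (detect_edges("normalized.bmp", "edges.bmp"))      return 1;
        if (binarize_image("edges.bmp", "binary.bmp"))        return 1;
        if (filter_noise("binary.bmp", "filtered.bmp"))       return 1;
        if (thin_image("filtered.bmp", "thinned.bmp"))        return 1;
        if (extract_minutiae("thinned.bmp", "minutiae.txt"))  return 1;
        return 0;
    }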
3.2 BMP File Format
Windows bitmap files are stored in a device-independent format so that a bitmap can be
displayed on any display device. The term "device independent" means that the bitmap specifies
pixel color in a form that is independent of the method used by a display to represent
color [7]. By default, a Windows bitmap file has the extension .BMP. Each image
file format is structured uniquely, and in the process of image manipulation it is very
important to understand the particular image file format. The BMP image file format is
structured as a bitmap file header, a bitmap information header, a color table, and an array
of bytes representing the bitmap bits, also known as the data offset [8].
Figure 3: Bitmap image format [8]
Offset   Size (bytes)   Description
0        2              Signature
2        4              Size of BMP file
6        2              Reserved
8        2              Reserved
10       4              Offset to start of image data in bytes
14       4              Size of BITMAPINFOHEADER structure, must be 40
18       4              Image width in pixels
22       4              Image height in pixels
26       2              Number of planes in the image
28       2              Number of bits per pixel
30       4              Compression type
34       4              Size of image data in bytes
38       4              Horizontal resolution in pixels per meter
42       4              Vertical resolution in pixels per meter
46       4              Number of colors in image
50       4              Number of important colors
Table 1: BMP (windows) header format (54 bytes) [9]
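For reference, the 54-byte header in table 1 can also be viewed as a packed structure. The sketch below is an illustration only; the field names are mine, structure packing must be forced to 1-byte alignment (compiler specific) for the layout to match the table, and the project code instead reads each field byte by byte, as shown next.

    #pragma pack(1)                        /* no padding: layout must match table 1 */
    struct BmpHeader {
        unsigned char  signature[2];       /* offset  0: "BM"                        */
        unsigned long  fileSize;           /* offset  2: size of BMP file in bytes   */
        unsigned short reserved1;          /* offset  6                              */
        unsigned short reserved2;          /* offset  8                              */
        unsigned long  dataOffset;         /* offset 10: start of image data         */
        unsigned long  infoHeaderSize;     /* offset 14: must be 40                  */
        long           width;              /* offset 18: image width in pixels       */
        long           height;             /* offset 22: image height in pixels      */
        unsigned short planes;             /* offset 26                              */
        unsigned short bitsPerPixel;       /* offset 28                              */
        unsigned long  compression;        /* offset 30                              */
        unsigned long  imageSize;          /* offset 34: image data size in bytes    */
        long           xPixelsPerMeter;    /* offset 38                              */
        long           yPixelsPerMeter;    /* offset 42                              */
        unsigned long  colorsUsed;         /* offset 46                              */
        unsigned long  colorsImportant;    /* offset 50                              */
    };
    #pragma pack()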
A part of the function used to retrieve FileSize, ImageWidth, ImageHeight and NumberOfColors:
fseek(inputFile, offset, SEEK_SET);
for (i = 1; i <= size; i++)
{
    fread(charPtr, sizeof(char), 1, inputFile);
    value = (long)(value + (*charPtr) * (pow(256, (i - 1))));
}
In the function above, SEEK_SET tells fseek to measure the offset from the beginning of the
bitmap file, and "size" is the field size in bytes as listed in table 1. The loop reads the
field one byte at a time; each byte is weighted by the corresponding power of 256 (least
significant byte first) and added to "value", so that after the loop "value" holds the desired
field, such as FileSize or ImageHeight. The value returned from this function is used to
manipulate the .bmp image.
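For clarity, the byte-by-byte read above can be wrapped in a small helper. The sketch below is a self-contained illustration under the same little-endian assumption; the function name readBmpField and its usage are mine, not from the project source.

    #include <stdio.h>

    /* Read a little-endian unsigned field of `size` bytes located at `offset`
       from the start of the bitmap file (same idea as the snippet above). */
    long readBmpField(FILE *inputFile, long offset, int size)
    {
        long value = 0;
        long factor = 1;
        int i, byteValue;

        fseek(inputFile, offset, SEEK_SET);   /* SEEK_SET: offset is from file start */
        for (i = 0; i < size; i++) {
            byteValue = fgetc(inputFile);     /* read one byte                       */
            value += (long)byteValue * factor;/* least significant byte first        */
            factor *= 256;
        }
        return value;
    }

    /* Example use (offsets taken from table 1):
         imageWidth  = readBmpField(fp, 18, 4);
         imageHeight = readBmpField(fp, 22, 4);  */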
BMP Data Offset:
The bpp (bits per pixel) field shown in table 1 determines the number of bits that represent
each pixel and the maximum number of colors in the bitmap. An 8 bits per pixel gray bitmap
image is used for the fingerprint identification process. An 8 bits per pixel image has a
maximum of 256 colors, each pixel occupies 1 byte, and each color table entry occupies 4
bytes [7]. The formula to determine the starting address of the pixel data offset is as follows:

14 + 40 + 4 * (number of colors) = starting address of data offset    (3.2.1)

Figure 4: Data offset starting location equation [8]

For an 8 bits per pixel image with 256 colors, the pixel data therefore starts at byte
14 + 40 + 4 * 256 = 1078.
3.3 Image Normalization:
Image enhancement is one of the most important procedures in the fingerprint identification
process, and image normalization is a well known step in fingerprint enhancement. It is
necessary to apply normalization to an overexposed (too bright) or underexposed (too dark)
grayscale input image [10]. The aim of this process is to standardize the intensity of an
image by adjusting the gray level values so that they lie within a desired range [11].
The normalization factor is calculated from the mean and variance of the image.
Different grayscale input images have different means and variances depending on
the lighting conditions. In order to change the appearance of the image, several
calculations are performed to approximate the desired mean and variance [10]. To obtain
the mean and variance of a given input image, a histogram of pixel intensity values is
constructed. The histogram distribution shown in figure 5 was obtained by collecting the
pixel intensities of a given grayscale input image.
Figure 5: Histogram distribution of grayscale image [12]
In an image-processing context, a histogram normally refers to the distribution of
pixel intensity values. There are 256 possible intensities for an 8 bits per pixel
grayscale image, and the histogram displays the distribution of pixels among those 256
gray levels. The histogram algorithm operates by reading the grayscale values, with pixel
intensity 0 counted at the first entry and 255 at the last [12]. Each entry of the pixel
intensity array counts the total number of pixels that have that value. The relative
frequency is then calculated by dividing the count for each gray level by the total number
of pixels.
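As a concrete illustration of the counting step described above, a minimal sketch is given below. It assumes the 8-bit pixel values have already been read into an array imagePixels of length imageWidth * imageHeight; the array name is illustrative, while pixelFreqHist matches the identifier used in the mean and variance snippets that follow.

    /* Count how many pixels take each of the 256 possible gray levels.
       pixelFreqHist[] then holds raw counts; dividing by the total number of
       pixels (as done in the mean/variance code below) gives the frequency. */
    long pixelFreqHist[256];
    long totalPixels = (long)imageWidth * imageHeight;
    long k;

    for (k = 0; k < 256; k++)
        pixelFreqHist[k] = 0;

    for (k = 0; k < totalPixels; k++)
        pixelFreqHist[imagePixels[k]]++;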
The mean and variance are calculated by applying the weighted arithmetic mean and
population variance equations to the collected histogram data.

\bar{x} = \frac{\sum_{i=1}^{n} w_i x_i}{\sum_{i=1}^{n} w_i}    (3.3.1)

Figure 6: Weighted arithmetic mean [13]
As shown in figure 6, represented in code:

Mean = 0;
for (i = 0; i < 256; i++)
    Mean += i * pixelFreqHist[i];
Mean = Mean / (imageWidth * imageHeight);

\sigma^2 = \frac{1}{N} \sum_{i=1}^{N} (x_i - \mu)^2    (3.3.2)

Figure 7: Population variance [14]
As shown in figure 7, \sigma^2 is the variance; represented in code:

Sigma = 0;
for (i = 0; i < 256; i++)
    Sigma += pixelFreqHist[i] * (i - Mean) * (i - Mean);
Sigma /= (imageWidth * imageHeight);
Sigma = sqrt(Sigma);
The process normalizes every pixel according to the following equation:
N(i,j) = \begin{cases} M_0 + \sqrt{\dfrac{V_0 \, (I(i,j) - M)^2}{V}} & \text{if } I(i,j) > M \\[6pt] M_0 - \sqrt{\dfrac{V_0 \, (I(i,j) - M)^2}{V}} & \text{otherwise} \end{cases}    (3.3.3)

Figure 8: Image normalization equation [10]
where I(i, j) represents the gray level value of the actual image data at pixel (row, col),
M and V are the estimated mean and variance of the input image, and M0 and V0 are the
desired mean and variance, respectively. The equation in figure 8 is implemented in code
as follows:
//desired mean = mean0, desired variance = sigma0
mean0 = 128;
sigma0 = 128;
coeff = sigma0 / Sigma;    //Sigma is the image standard deviation computed above
for (i = 0; i <= imageHeight - 1; i++)
{
    for (j = 0; j <= imageWidth - 1; j++)
    {
        //normalization equation; pixelData is the gray value at (i, j)
        //and Mean is the image mean computed above
        normalizedValue = mean0 + coeff * (pixelData - Mean);
        if (normalizedValue < 0) normalizedValue = 0;
        else if (normalizedValue > 255) normalizedValue = 255;
    }
}
Normalization does not change the ridge structure of an image; it is performed to
standardize the dynamic range of variation in the gray level values. The resulting image
shown in Figure 9 was obtained by applying histogram normalization to a grayscale image.
Input: Grayscale image
Output: Normalized image
Figure 9: Normalized bitmap image
3.4 Edge Detection:
Edges characterize image boundaries and are very important in image processing.
Edges are areas in an image with high intensity contrast, that is, a change of intensity
from one pixel to its neighboring pixels. Image edge detection preserves the
structural properties of an image while reducing a large amount of variable data and filtering
out useless information [15]. A successful implementation of edge detection
substantially simplifies the task of interpreting information from the original image.
However, it is not always possible to obtain ideal edge detection for moderately complex
images. There are many ways to perform edge detection, and the majority of the
methods fall into one of two categories: gradient and Laplacian [15].
The gradient method detects edges by looking for the maximum and minimum in the
first derivative of the image. The Laplacian method searches for zero crossings in
the second derivative of the image to find the edges.
[Graph: actual data points, first-order derivative, second-order derivative]
Figure 10: Graphical representation of the Laplacian method [15]
The Laplacian of Gaussian filter is a convolution filter and is implemented here to
detect edges in a given image. The filter first applies a Gaussian blur, then applies the
Laplacian, and finally checks for zero crossings [16]. Highlighted edges are the result of
this filter. The Laplacian of Gaussian operator takes a single grayscale image as
input and generates an output close to a binary image [16].
The Laplacian L(x, y) of an image with pixel intensity I(x, y) is shown below:
Lx, y  
2I
x
2

2I
y
(3.4.1)
2
Figure 11: Laplacian equation [16]
The Laplacian of Gaussian can be pre-calculated in advance so that only one convolution
filter needs to be applied to the image at run time. The 2-D Laplacian of Gaussian function
centered on zero with standard deviation \sigma has the form:

LoG(x, y) = -\frac{1}{\pi \sigma^4} \left[ 1 - \frac{x^2 + y^2}{2\sigma^2} \right] e^{-\frac{x^2 + y^2}{2\sigma^2}}    (3.4.2)

Figure 12: Laplacian of Gaussian equation [16]
where LoG is the 2-D Laplacian of Gaussian function, x and y are the coordinates, and
\sigma is the standard deviation.
The 2-D Laplacian of Gaussian (LoG) can be approximated by a 5X5 convolution kernel,
such as:
0    0     1    0    0
0    1     2    1    0
1    2   -16    2    1
0    1     2    1    0
0    0     1    0    0
Table 2: 5X5 Laplacian of gaussian convolution kernel [17]
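The kernel values in table 2 can be stored directly as a small constant array. The sketch below is an assumption about how it was declared; the array name LoG is chosen to match the identifier used in the convolution code that follows.

    /* 5x5 Laplacian of Gaussian convolution kernel from table 2 */
    int LoG[5][5] = {
        { 0, 0,   1, 0, 0 },
        { 0, 1,   2, 1, 0 },
        { 1, 2, -16, 2, 1 },
        { 0, 1,   2, 1, 0 },
        { 0, 0,   1, 0, 0 }
    };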
The code implementation of edge detection using the Laplacian of Gaussian convolution
kernel shown in table 2 is as follows:
for (row = 0; row <= imageHeight - 1; row++) {
    for (col = 0; col <= imageWidth - 1; col++) {
        //convolution starts here
        laplacianValue = 0;
        for (x = -2; x <= 2; x++) {
            for (y = -2; y <= 2; y++) {
                //applying Laplacian of Gaussian convolution kernel
                //(border pixels would additionally need an out-of-range guard)
                laplacianValue = laplacianValue +
                    pixel[(x + row) * imageWidth + (y + col)] * LoG[x + 2][y + 2];
            }
        }
        //clamp the result to the 8-bit range
        if (laplacianValue > 255) laplacianValue = 255;
        if (laplacianValue < 0)   laplacianValue = 0;
    }
}
Image obtained from Laplacian of Gaussian Edge Detection:
Input: Normalized image
Output: Edge detected image
Figure 13: Edge detected bitmap image
3.5 Image Binarization:
Fingerprint binarization is a very important part of a fingerprint identification system.
The minutiae extraction algorithm is applied on a binary image with two levels of interest:
the black pixels represent valleys and the white pixels represent ridges [11]. Binarization
is the process of converting a gray image into a binary image with only black and white pixel
values, which improves the contrast between ridges and valleys [11]. A threshold value is
applied to the gray image for the binary image conversion. If a pixel intensity value is
higher than the threshold, it is set to 255 (white); otherwise, it is set to 0
(black) [18]. In the output binary image, black pixels correspond to the background and
white pixels correspond to the foreground. It is possible to segment an image by inspecting
its intensity histogram: the intensity of pixels within foreground objects differs from the
intensity of pixels within the background, so distinct peaks are expected in the histogram,
and a threshold can be chosen between them to separate the foreground objects from the
background.
Figure 14: Bi-modal intensity distribution [19]
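As a concrete illustration of the thresholding rule described above, a minimal sketch is shown below; the array name imagePixels and the variable threshold are illustrative and assume the threshold has already been chosen from the histogram.

    /* Convert every gray pixel to pure black or white using a single threshold.
       White (255) becomes foreground (ridges), black (0) becomes background. */
    long k;
    for (k = 0; k < (long)imageWidth * imageHeight; k++)
    {
        if (imagePixels[k] > threshold)
            imagePixels[k] = 255;   /* foreground / ridge  */
        else
            imagePixels[k] = 0;     /* background / valley */
    }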
The resulting image after applying binarization with thresholding calculation for a given
image is shown as follows:
Input: Edge detected grayscale image
Output: Binarized image
Figure 15: Binarized bitmap image
3.6 Noise Filter:
Noise reduction is the process of filtering a specific pixel depending on the values of its
surrounding pixels. In this project, the noise filter is applied right after converting the
gray image into a binary image. A 3X3 window positioning is used to filter the image.
Top Left Pixel Data       Top Pixel Data        Top Right Pixel Data
Left Pixel Data           Actual Pixel Data     Right Pixel Data
Bottom Left Pixel Data    Bottom Pixel Data     Bottom Right Pixel Data
Table 3: 3X3 Window frame for noise filter
This positioning process determines the validity of the actual pixel data. Any pixel
with value 255 (white) that is surrounded entirely by pixels with value 0 (black) is
considered noise or invalid data. The same applies to a pixel with value 0 surrounded
entirely by pixels with value 255.
Figure 16: Noisy white pixel conversion
Figure 17: Noisy black pixel conversion
As shown in figure 16, a single white pixel was removed as noise and replaced with a
black pixel. In figure 17, a single black pixel was removed as noise and replaced with a
white pixel. Eight combinations have been taken into consideration to identify a valid
white or black pixel value. In the 3X3 window positioning, the middle location is the
currently processed pixel. A valid pixel retains its original value if one or more of its
surrounding pixels have the same value as the actual pixel.
Figure 18: Eight combinations to determine white pixel validity
Figure 19: Eight combinations to determine black pixel validity
The checks illustrated in figures 16, 17, 18 and 19 are represented in the following code.
//pixel is the actual (center) pixel
//if the value is black
if (pixel == 0)
{
    //verifying whether any of the 8 neighboring pixels is also black
    if (topLeft==0 || left==0 || bottomLeft==0 || top==0 || bottom==0
        || topRight==0 || right==0 || bottomRight==0)
        pixel = 0;     //valid value
    else
        pixel = 255;   //converting the pixel value
}
//if the value is white
if (pixel == 255)
{
    //verifying whether any of the 8 neighboring pixels is also white
    if (topLeft==255 || left==255 || bottomLeft==255 || top==255
        || bottom==255 || topRight==255 || right==255 || bottomRight==255)
        pixel = 255;   //valid value
    else
        pixel = 0;     //converting the pixel value
}
The image obtained after applying this filter on a binarized image is shown in Figure 20.
Input: Binarized image
Output: Noise reduced image
Figure 20: Filtered bitmap image
3.7 Image Thinning:
Image thinning is the final image enhancement step performed prior to minutiae
extraction. Thinning is a morphological process applied to a binary image to remove
selected foreground pixels [20]. A thinning algorithm produces a skeleton of the binary
image while preserving the connectivity of the ridge structure [11]. It is
commonly used to erode away foreground pixels until the ridge lines are one pixel
wide. Thinning is normally applied only to a binary image, and the output is also a binary
image. Different algorithms use different approaches to solve the thinning problem.
Thinning algorithms are broadly divided into two groups: iterative and non-iterative [21].
Iterative algorithms are further subdivided into sequential and parallel. In a sequential
iterative algorithm, the surrounding pixels help make the decision for a single pixel;
in a parallel iterative algorithm, pixels are judged independently for
decision-making [22, 23]. This thinning phase is implemented using the Hilditch algorithm,
which belongs to the sequential group. The 3X3 window frame used to implement the
Hilditch algorithm is as follows:
Figure 21: 3X3 Window frame for hilditch algorithm [24]
where pixel P0 is the actual pixel and P1 to P8 are its eight neighbors.
A counter is introduced as part of this algorithm. If the actual foreground pixel is white,
the eight combinations verified are as follows:
[Figure 22 consists of eight 3X3 masks. Each mask has the actual pixel P0 at the center, a single neighbor position set to 255 (white), and the remaining positions marked X (don't care); together the eight masks cover all eight neighbor positions.]
Figure 22: Eight combinations for 1st level hilditch algorithm [25]
In figure 22, each of the eight combinations has the actual pixel P0 in the center, and X
represents a don't-care position. For each combination that is satisfied, the counter is
incremented by one. If the resulting counter value falls between 2 and 6 (inclusive), the
next set of combinations is verified as follows:
[Figure 23 consists of eight 3X3 masks. Each mask has the actual pixel P0 at the center and one clockwise-adjacent pair of neighbors set to 0 followed by 255, with the remaining positions marked X (don't care); together the eight masks cover all eight possible 0-to-255 transitions around P0.]
Figure 23: Eight combinations for 2nd level hilditch algorithm [25]
Here 255, 0 and X represent a white pixel, a black pixel and a don't-care position,
respectively, in the clockwise surroundings of the centered actual pixel P0.
The counter is reset to zero and incremented by one for each of the cases in figure 23 that
is satisfied. The last stage, which determines whether the actual pixel value is to be
converted from a foreground (white) to a background (black) pixel, is as follows:
(a) P2, P4, P6      (b) P4, P6, P8      (c) P2, P6, P8      (d) P2, P4, P8
Figure 24: Cases to retain pixel value [25]
As shown in figure 24, if the counter value from the previous stage is one and the products
of the pixel values at the numbered locations in the four cases are zero, then the actual
foreground pixel is converted to a background pixel. More specifically, if P2*P4*P6 = 0
from case (a) and P4*P6*P8 = 0 from case (b), or P2*P6*P8 = 0 from case (c) and
P2*P4*P8 = 0 from case (d), then the foreground pixel white (255) is changed to the
background pixel black (0) [25]. The code implementation of this algorithm is as follows:
// foreground actual pixel
if (actual == 255)
{
    count = 0;
    //a) checking pixel 1, has 2 to 6 (inclusive) neighbors
    if (right == 255)       { count++; }
    if (bottomRight == 255) { count++; }
    if (bottom == 255)      { count++; }
    if (bottomLeft == 255)  { count++; }
    if (left == 255)        { count++; }
    if (topLeft == 255)     { count++; }
    if (top == 255)         { count++; }
    if (topRight == 255)    { count++; }

    if ((count >= 2) && (count <= 6))
    {
        count = 0;
        //b) starting from 2, go clockwise until 9, and count the
        //   number of 0 to 1 transitions. This should be equal to 1.
        if ((topRight == 0)    && (right == 255))       { count++; }
        if ((right == 0)       && (bottomRight == 255)) { count++; }
        if ((bottomRight == 0) && (bottom == 255))      { count++; }
        if ((bottom == 0)      && (bottomLeft == 255))  { count++; }
        if ((bottomLeft == 0)  && (left == 255))        { count++; }
        if ((left == 0)        && (topLeft == 255))     { count++; }
        if ((topLeft == 0)     && (top == 255))         { count++; }
        if ((top == 0)         && (topRight == 255))    { count++; }

        if (count == 1)
        {
            count = 0;
            if (flag == 1)
            {
                //c) p2*p4*p6=0 (i.e. either 2, 4, or 6 is off)
                if ((right * bottom * bottomLeft) == 0)
                {
                    //d) p4*p6*p8=0
                    if ((bottom * bottomLeft * top) == 0)
                    {
                        actual = 0; //converting pixel data
                    }
                }
                flag = 0;
            }
            else
            {
                //c) p2*p6*p8=0
                if ((right * bottomLeft * top) == 0)
                {
                    //d) p2*p4*p8=0
                    if ((right * bottom * top) == 0)
                    {
                        actual = 0; //converting pixel data
                    }
                }
                flag = 1;
            }
        }
    }
}
As mentioned earlier, Hilditch is a sequential iterative algorithm; the process shown above
needs to be iterated three times over the image to achieve the desired result.
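A sketch of how the per-pixel test above might be driven over the whole image for the three passes is given below; the loop structure and the hilditchTest helper (assumed here to wrap the code shown above and return the new value of the center pixel) are illustrative assumptions, not the project's actual code.

    /* Illustrative driver: apply the Hilditch per-pixel test to every interior
       pixel and repeat the full sweep three times, as described in the text. */
    int hilditchTest(int actual, int top, int topRight, int right, int bottomRight,
                     int bottom, int bottomLeft, int left, int topLeft, int *flag);

    void thinImage(unsigned char *img, int imageWidth, int imageHeight)
    {
        int pass, row, col, flag = 0;

        for (pass = 0; pass < 3; pass++) {
            for (row = 1; row < imageHeight - 1; row++) {
                for (col = 1; col < imageWidth - 1; col++) {
                    unsigned char *p = img + (long)row * imageWidth + col;
                    /* gather the 3x3 neighborhood and update the center pixel */
                    *p = (unsigned char)hilditchTest(
                            p[0], p[-imageWidth], p[-imageWidth + 1], p[1],
                            p[imageWidth + 1], p[imageWidth], p[imageWidth - 1],
                            p[-1], p[-imageWidth - 1], &flag);
                }
            }
        }
    }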
The images obtained in the process of three level thinning are as follows:
1st level Thinning:
Input: Binarized image
Output: 1st level Thinned image
2nd level Thinning:
Input: 1st level Thinned image
Output: 2nd level Thinned image
Final Thinning:
Input: 2nd level Thinned image
Output: Final Thinned image
Figure 25: Thinned bitmap image
3.8 Minutiae Extraction:
The minutiae extraction process identifies characteristic points on a skeletonized binary
image. Most fingerprint scan technologies are based on minutiae. After the fingerprint
image has been thinned, the minutiae are extracted from the thinned image in the final
stage. Image postprocessing is then performed after minutiae extraction to eliminate
false minutiae. The most commonly employed method for minutiae extraction, the Crossing
Number (CN) concept, is implemented in this project [26, 27, 28]. The minutiae are
extracted by scanning the local neighborhood of each ridge pixel using a 3X3 window frame.
P4    P3    P2
P5    P     P1
P6    P7    P8
Figure 26: 3X3 window frame for minutiae extraction [11]
where the actual pixel (P) is surrounded by its eight neighboring pixels (P1 to P8).
The crossing number of a pixel P is defined as half the sum of the differences between
pairs of adjacent pixels in the eight-neighborhood of P. Mathematically,
cnP  
1
  val  P

 val  P
i
mod
8
i

1



2 i  1..8 
(3.8.1)
Figure 27: Crossing number equation [11]
where P0 to P7 form the ordered sequence of the eight neighboring pixels of P (the mod
operation closes the cycle, so P8 wraps back to P0) and val(P) is the pixel value.
Using the properties of the crossing number (CN) shown in table 4, a ridge pixel can be
classified as a ridge ending, a bifurcation, or not a minutiae point. A ridge pixel with a CN
value of one corresponds to a ridge ending, and a CN value of three corresponds to a bifurcation.
CN    Property
1     Ridge Ending
2     No Minutiae Point
3     Bifurcation
Table 4: Properties of crossing number [11]
(a) CN=1: Ridge Ending    (b) CN=2: No Minutiae Point    (c) CN=3: Bifurcation
Figure 28: Example of crossing number properties [29]
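A minimal sketch of the crossing-number computation of equation 3.8.1 and the classification in table 4 is given below; the flat array layout and the function name are assumptions, and the sketch treats any nonzero value in the thinned image as a ridge pixel.

    /* Compute the crossing number (CN) of the ridge pixel at (row, col) in a
       thinned binary image (background = 0, ridge = any nonzero value).
       CN == 1 -> ridge ending, CN == 3 -> bifurcation, CN == 2 -> no minutia. */
    int crossingNumber(unsigned char *img, int imageWidth, int row, int col)
    {
        unsigned char *p = img + (long)row * imageWidth + col;
        int P[9];   /* P1..P8 of figure 26 stored in P[0..7]; P[8] wraps to P[0] */
        int i, sum = 0;

        P[0] = (p[1] != 0);                  /* P1: right        */
        P[1] = (p[-imageWidth + 1] != 0);    /* P2: top right    */
        P[2] = (p[-imageWidth] != 0);        /* P3: top          */
        P[3] = (p[-imageWidth - 1] != 0);    /* P4: top left     */
        P[4] = (p[-1] != 0);                 /* P5: left         */
        P[5] = (p[imageWidth - 1] != 0);     /* P6: bottom left  */
        P[6] = (p[imageWidth] != 0);         /* P7: bottom       */
        P[7] = (p[imageWidth + 1] != 0);     /* P8: bottom right */
        P[8] = P[0];                         /* close the cycle  */

        for (i = 0; i < 8; i++)
            sum += (P[i] != P[i + 1]);       /* |val difference| for 0/1 values */

        return sum / 2;                      /* the crossing number */
    }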
All the preprocessing stages, including normalization, image enhancement, filtering, and
thinning, do not guarantee a perfectly clean fingerprint skeleton. False minutiae, such as
broken ridges, spurious ridges and holes, can be introduced by blurry or noisy input images
and by the image manipulation performed during thinning [30]. So, after the minutiae are
extracted, it is necessary to apply a postprocessing algorithm to eliminate false minutiae.
(a) Spur    (b) Hole    (c) Triangle    (d) Spike
Figure 29: Example of false minutiae structure [11]
As shown in figure 29, a spur creates a false ridge ending, a hole and a triangle both create
false bifurcations, and a spike creates both a false ridge ending and a false bifurcation.
Minutiae points can be validated using the postprocessing algorithm proposed by Tico and
Kuosmanen [31]. Like other techniques, that algorithm operates on the thinned
(skeleton) image; it is not implemented as part of this project. To fully describe a minutiae
point, in addition to its x and y coordinates it is necessary to find its angle with respect
to the origin (0, 0) pixel.
(a) Ridge Ending
(b) Bifurcation
Figure 30: Angle shown for minutiae point [32]
where X0 and Y0 are the coordinates of the minutiae point and \theta is the angle.
As shown in figure 30, it is possible to find the angle for known x and y coordinates of a
minutiae point. The equations for converting Cartesian coordinates to polar coordinates in
order to determine the angle are as follows:
r = \sqrt{x^2 + y^2}  \quad \text{(Pythagoras' theorem)}
\theta = \arctan\left(\frac{y}{x}\right)  \quad \text{(tangent function)}    (3.8.2)
Figure 31: Polar coordinates equation [33]
where (x, y) are Cartesian coordinates and (r, \theta) are polar coordinates.
As mentioned earlier, minutiae extraction is applied to a thinned image. The image used
for minutiae extraction is shown below:
Figure 32: Thinned bitmap image for minutiae extraction
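As a concrete illustration of equation 3.8.2, the sketch below converts the Cartesian coordinates of a minutiae point into a polar angle in degrees; atan2 is used rather than atan so that all four quadrants are handled, which is a standard-library substitution and not necessarily what the project code did.

    #include <math.h>

    /* Distance and angle (in degrees, 0..360) of a minutiae point (x, y)
       measured with respect to the origin (0, 0), following equation 3.8.2. */
    void toPolar(double x, double y, double *r, double *angleDeg)
    {
        *r = sqrt(x * x + y * y);                    /* Pythagoras' theorem */
        *angleDeg = atan2(y, x) * 180.0 / 3.14159265358979;
        if (*angleDeg < 0.0)
            *angleDeg += 360.0;                      /* map into 0..360     */
    }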
3.9 Experimental Result:
Part of the minutiae extraction experimental result is shown in table 5. The first column of
table 5 indicates whether a minutiae point is a ridge ending (CN = 1) or a bifurcation (CN = 3).
The angle is calculated using the polar coordinates equation shown in figure 31, with respect
to the (0, 0) origin. X and Y are the Cartesian coordinates of the minutiae point.
Crossing Number    Angle    X      Y
3                  270      113    86
1                  225      114    89
3                  67       135    90
3                  135      114    92
1                  225      116    92
3                  292      115    93
3                  45       127    94
1                  315      110    99
3                  67       124    102
3                  270      125    102
1                  90       115    103
3                  292      125    103
3                  292      110    104
1                  22       108    105
3                  292      110    105
3                  67       69     106
3                  67       108    106
1                  202      128    121
3                  67       128    122
3                  67       54     124
3                  45       54     125
3                  202      55     125
3                  67       130    125
1                  292      123    126
Table 5: Experimental result of minutiae extraction
Chapter 4
LIMITATIONS
The Cyrix MediaGX based embedded system and the fingerprint identification application
have some limitations.
In terms of hardware limitations, this system has only 64 megabytes of dynamic
random access memory. Any image manipulation that requires more than 64 megabytes of
dynamic memory will end in an out-of-memory allocation error. As a result, the
implementation requires calculations to determine whether the dynamic memory usage
falls within the limited memory range. Both the system board and the hard drive require
twelve-volt, five-volt and ground connections. A parallel power supply split from the same
source will not provide enough voltage to power up both devices. To resolve this
issue, two separate power supplies are provided to power up the system board
and the hard drive.
In terms of software limitations, the application only accepts images whose
height and width are less than or equal to 128 pixels. This limitation is directly related to
the hardware limitation: since the system can only handle up to 64 megabytes of dynamic
memory, any image larger than the size mentioned above would violate that rule (a simple
guard of this kind is sketched below). The different image enhancement and manipulation
operations may also result in losing significant segments of fingerprint data. In addition,
false minutiae points were not completely filtered out in the last phase of minutiae
extraction. These constraints must be taken into consideration while collecting the minutiae
points for a given image. As a result, the fingerprint matching will provide a 70 to 80
percent accurate result for a given set of collected minutiae points.
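The size restriction described above can be enforced with a simple check before any image buffers are allocated; the sketch below is an illustration of that guard, and the names and limit handling are assumptions rather than the project's actual code.

    #include <stdio.h>

    /* Reject images larger than the 128 x 128 limit imposed by the
       64 MB memory constraint, before allocating processing buffers. */
    #define MAX_DIMENSION 128

    int checkImageSize(long imageWidth, long imageHeight)
    {
        if (imageWidth > MAX_DIMENSION || imageHeight > MAX_DIMENSION) {
            printf("Image %ldx%ld exceeds the %dx%d limit.\n",
                   imageWidth, imageHeight, MAX_DIMENSION, MAX_DIMENSION);
            return 0;   /* reject  */
        }
        return 1;       /* accept  */
    }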
Chapter 5
CONCLUSIONS
The primary goal of this project was to identify, implement and illustrate an application for
a specific embedded system that can be utilized to secure system information. Today, many
portable devices are equipped with the latest biometric identification tools for securing
information, and biometric-enabled identification tools have quickly become accepted as an
immediate solution to identity problems. Examples include, but are not limited to, face
recognition, fingerprint recognition, DNA, hand and palm geometry, and iris recognition.
If any of these identification processes fails to verify an individual, it becomes practically
impossible to access any personal or private information on the system. There is still plenty
of room for improving biometrics to achieve better accuracy in identifying the exact
individual. Improving the identification algorithm, obtaining clearer input data and improving
many other constraints will provide a higher success ratio for achieving the desired results.
In this project, I have successfully implemented a fingerprint identification process
to secure the personal information of a system. In addition, I have presented algorithms that
improve the quality of the scanned image so that accurate data can be collected for
identification. Next, a matching technique can be implemented that utilizes the information
provided by this project, and the implemented algorithms can be enhanced for more accurate
results.
BIBLIOGRAPHY
[1] PCTechGuide, "Cyrix MediaGX", 1998-2009. [Online]. Available: http://www.pctechguide.com/24Cyrix_MediaGX.htm. [Accessed: Mar. 24, 2010].
[2] Cyrix Corporation, "MediaGX™ Architectural System Overview", Richardson, TX, 1995-1997. [Online]. Available: http://alag3.mfa.kfki.hu/images/CYRIX/GXOVERVW.HTM. [Accessed: Mar. 24, 2010].
[3] G. Linley, "MediaGX Targets Low-Cost PCs", Microprocessor Report, Mar. 1997.
[4] J. Hruska, "Microsoft puts Windows 3.11 for Workgroups out to pasture", Nov. 2008. [Online]. Available: http://arstechnica.com/hardware/news/2008/11/microsoft-puts-windows-3-11-for-workgroups-out-to-pasture.ars. [Accessed: Mar. 24, 2010].
[5] V. L. Lorenzo, P. H. Pellitero, J. I. Martínez Torre, J. C. Villar, "Fingerprint Minutiae Extraction Based On FPGA and MatLab", University project, Rey Juan Carlos University, 2005.
[6] S. M. Ali, M. S. Al-Zawary, "A New Fast Automatic Technique for Fingerprints Recognition and Identification", Journal of Islamic Academy of Sciences, vol. 10, no. 2, pp. 55-66, 1997.
[7] M. Reddy, "Bitmap-File Structures", 2004-2010. [Online]. Available: http://www.martinreddy.net/gfx/2d/BMP.txt. [Accessed: Mar. 24, 2010].
[8] P. Bourke, "BMP Image Format", Jul. 1998. [Online]. Available: http://local.wasp.uwa.edu.au/~pbourke/dataformats/bmp/. [Accessed: Mar. 24, 2010].
[9] T. Gruber Software, Inc., "BMP (Windows) Header Format", 2001. [Online]. Available: http://www.fastgraph.com/help/bmp_header_format.html. [Accessed: Mar. 24, 2010].
[10] L. Hong, Y. Wan, and A. Jain, "Fingerprint image enhancement: Algorithm and performance evaluation", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 20, no. 8, pp. 777-789, Aug. 1998.
[11] R. Thai, "Fingerprint Image Enhancement and Minutiae Extraction", Honours project, The University of Western Australia, 2003.
[12] R. Fisher, S. Perkins, A. Walker, E. Wolfart, "Intensity Histogram - Hypermedia Image Processing Reference", 2000. [Online]. Available: http://homepages.inf.ed.ac.uk/rbf/HIPR2/histgram.htm. [Accessed: Mar. 24, 2010].
[13] G. H. Hardy, J. E. Littlewood, G. Pólya, "Inequalities (2nd ed.)", Cambridge University Press, 1988.
[14] M. Loeve, "Probability Theory", Graduate Texts in Mathematics, vol. 45, 4th ed., Springer-Verlag, p. 12, 1977.
[15] B. Green, "Edge Detection Tutorial", 2002. [Online]. Available: http://www.pages.drexel.edu/~weg22/edge.html. [Accessed: Mar. 24, 2010].
[16] R. Fisher, S. Perkins, A. Walker, E. Wolfart, "Laplacian/Laplacian of Gaussian - Hypermedia Image Processing Reference", 2000. [Online]. Available: http://homepages.inf.ed.ac.uk/rbf/HIPR2/log.htm. [Accessed: Mar. 24, 2010].
[17] R. Wang, "Laplacian of Gaussian (LoG)", Sep. 2009. [Online]. Available: http://fourier.eng.hmc.edu/e161/lectures/gradient/node10.html. [Accessed: Mar. 24, 2010].
[18] E. Davies, "Machine Vision: Theory, Algorithms and Practicalities", Academic Press, Chap. 4, 2003.
[19] R. Fisher, S. Perkins, A. Walker, E. Wolfart, "Thresholding", 2000. [Online]. Available: http://homepages.inf.ed.ac.uk/rbf/HIPR2/threshld.htm. [Accessed: Mar. 24, 2010].
[20] R. Gonzalez, R. Woods, "Digital Image Processing", Addison-Wesley Publishing Company, pp. 518-548, 1992.
[21] S. Ahmed, M. Sharmin, C. M. Rahman, "A Generic Thinning Algorithm with Better Performance", Thesis report, Bangladesh University of Engineering and Technology, Dhaka.
[22] V. Ubeda, "A Parallel Thinning Algorithm Using KxK Masks", JPRAI, vol. 7, pp. 1183-1202, 1993.
[23] Z. Zhang, Y. Y. Wang, "A New Parallel Thinning Methodology", IJPRAI, vol. 8, pp. 999-1011, 1994.
[24] M. Yin, S. Narita, "Speedup Method for Real-Time Thinning Algorithm", Digital Image Computing Techniques and Applications, Jan. 2002.
[25] D. Azar, "Hilditch's Algorithm for Skeletonization", Pattern Recognition course, McGill University, 1997.
[26] J. C. Amengual, A. Juan, J. C. Prez, F. Prat, S. Sez, J. M. Vilar, "Real-time minutiae extraction in fingerprint images", Proc. of the 6th Int. Conf. on Image Processing and its Applications, pp. 871-875, Jul. 1997.
[27] B. M. Mehtre, "Fingerprint image analysis for automatic identification", Machine Vision and Applications, vol. 6, no. 2, pp. 124-139, 1993.
[28] S. Kasaei, M. D., B. Boashash, "Fingerprint feature extraction using block-direction on reconstructed images", IEEE Region 10 Conf., Digital Signal Processing Applications, TENCON, pp. 303-306, Dec. 1997.
[29] F. A. Afsar, M. Arif, M. Hussain, "Fingerprint Identification and Verification System using Minutiae Matching", National Conference on Emerging Technologies, 2004.
[30] Q. Xiao, H. Raafat, "Fingerprint image postprocessing: a combined statistical and structural approach", Pattern Recognition, vol. 24, no. 10, pp. 985-992, 1991.
[31] M. Tico, P. Kuosmanen, "An algorithm for fingerprint image postprocessing", in Proceedings of the Thirty-Fourth Asilomar Conference on Signals, Systems and Computers, vol. 2, pp. 1735-1739, Nov. 2000.
[32] M. Kaur, M. Singh, A. Girdhar, P. S. Sandhu, "Fingerprint Verification System using Minutiae Extraction Technique", World Academy of Science, Engineering and Technology, vol. 46, 2008.
[33] "Polar and Cartesian Coordinates". [Online]. Available: http://www.mathsisfun.com/polar-cartesian-coordinates.html. [Accessed: Mar. 24, 2010].