CHAPTER 1
INTRODUCTION

1.1 Biometric Encryption (BE)
Electronic devices have become so integrated into human life that most daily functions today incorporate digital gadgets, which must necessarily be compact, efficient and real-time. Because resources are readily available in such an open environment, identity-based security measures must be incorporated to prevent misuse; information security and trusted-environment studies are therefore research fields that will see a great number of advances. Information security measures seen in everyday life include computer passwords, Automated Teller Machine (ATM) transactions, mobile phone access codes, and home and office security systems.
Security measures can be divided into three broad categories, namely knowledge-based, token-based and biometric-based systems. Knowledge-based security grants access to a user based on specific knowledge that should be known only to the legitimate user, such as a password or Personal Identification Number (PIN). Token-based security systems require the use of a token such as a key, smart card, RFID chip or magnetic card to unlock the system. Biometric-based systems require the user to provide physiological or behavioral proof of identity, such as a fingerprint, face, voice or signature, to access the knowledge, place or resources that are protected by the security system. Today's high-security systems mainly use cryptography, biometric authentication and data-hiding techniques for data, resource and access security [1, 2].
Cryptography is the process of encrypting data with a cipher so that it becomes scrambled for security purposes, and of decrypting the scrambled data to regain the original data when required. Encryption is performed using cryptographic schemes such as AES, DES, RSA and ECC. A cryptographic key must be used to perform the encryption/decryption process; this key is usually a long string of hexadecimal values. Security schemes that implement modern cryptographic algorithms have been proven to be highly secure and, in some cases, computationally infeasible to break even by brute-force attack [2]. However, cryptographic security systems do have a weak point, which is cryptographic key storage [3]. Since cryptographic keys are long, random strings of digits, they cannot be memorized by the user and are therefore stored on smart cards, hardware tokens or in a database, to be retrieved by giving the right password or PIN. Since cards and tokens can be stolen and passwords can be guessed or subjected to dictionary attacks, cryptographic security systems remain vulnerable.
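To make the key dependence concrete, the sketch below round-trips a message through a toy XOR stream cipher. This is purely illustrative and is in no way a stand-in for AES or the other schemes above; the key and message values are hypothetical.

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

// Toy XOR "cipher": the same routine encrypts and decrypts, because
// XOR-ing twice with the same keystream byte restores the plaintext.
// Real schemes such as AES use far stronger round functions, but the
// shape is the same: ciphertext = E(key, plaintext), plaintext = D(key, ciphertext).
std::string xorCipher(const std::string& data, const std::vector<uint8_t>& key) {
    std::string out = data;
    for (size_t i = 0; i < out.size(); ++i)
        out[i] ^= key[i % key.size()];   // repeat the key over the message
    return out;
}

int main() {
    std::vector<uint8_t> key = {0x2B, 0x7E, 0x15, 0x16};  // hypothetical key
    std::string secret = "PIN=4821";                       // hypothetical secret

    std::string ciphertext = xorCipher(secret, key);       // encrypt
    std::string recovered  = xorCipher(ciphertext, key);   // decrypt with same key

    std::cout << (recovered == secret ? "round-trip OK" : "mismatch") << "\n";
    // Anyone who steals the key decrypts everything, which is exactly the
    // key-storage weakness discussed above.
    return 0;
}
```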
Biometric authentication systems are security schemes that have a database of
secure templates made of user biometrics and grant access to users upon verifying
that a live biometric sample matches the template stored in the database. Biometric
systems must be able to handle small variations between the biometric sample and
the templates in the database, both because biometrics can change over time due to age, injuries and health conditions, and because of slight variations in the capture conditions such as angle, lighting and the sensitivity/accuracy of the biometric
device. While biometric templates are truly unique for each user and cannot be stolen
or guessed, biometric systems do not have the secure process structure that
cryptographic schemes do and hence are vulnerable to tampering and attacks at
various stages of the authentication process [4]. There could be far-reaching consequences if a user's biometric data were compromised. This could cause permanent inconvenience and problems for the person, since biometric data is irrevocable and cannot be changed the way a password can.
Both of the above-mentioned security schemes have well-known weaknesses that have been exploited in the past, which provides the rationale for implementing a stronger security scheme. Biometric encryption offers a solution to these shortcomings by using biometrics as the key of the security system while avoiding the weaknesses of the biometric authentication system's architecture.
Biometric encryption (BE), also known as a biometric cryptosystem or bio-cryptosystem, is a data security scheme that merges biometric data and secret information to secure the secret information in a "fuzzy vault". The secret information (referred to as the 'secret' from here on) is critical data that needs to be protected or hidden, such as a password for another security layer or a cryptographic key. The secret can only be retrieved from the fuzzy vault by an authorized user presenting a valid biometric. Biometric traits that can be used in BE systems include fingerprints [5-7], iris [8], face [9, 10] and palm print [11].
To extract biometric data from a given biometric sample, such as a fingerprint or finger-vein image, the input must be processed using various image processing algorithms such as data type conversion, noise elimination, binarization and normalization. These processes are carried out in a subsystem known as the
biometric extraction system. This subsystem is used in all generic biometric
authentication systems before the biometric matching module gives the final result.
However, the biometric extraction subsystem and all the image processing blocks are
outside the scope of this thesis.
The main challenge of a BE system is to bridge the fuzziness of biometric matching and the exactness required by key-based cryptosystems. This can be achieved through the fuzzy vault scheme, on which much work has been reported [3, 5-6, 11]. The fuzzy vault scheme is one of the methods used to implement BE. In this thesis, the proposed fuzzy vault-based BE system is based on the work by Nandakumar et al. [6]. For convenience, the system created based on the fuzzy vault scheme will be referred to as the fuzzy vault biometric encryption system, FVBE for short. FVBE implements a mechanism that binds the crypto key unequivocally to the user's biometrics [5]. The word 'fuzzy' here refers to the varied and inexact nature of biometric data, which usually shows some variance at each capture and also contains noise. The 'vault' is a metaphor for the secure template that results from the combination of biometric data and secret information. The secret does not exist separately within the fuzzy vault and hence cannot be extracted in any illegitimate way; it can only be reproduced when a close match to the enrolled biometric is presented to the system. Figure 1.1 shows the conceptual block diagram of a fuzzy vault-based biometric encryption system, with (a) enrollment/encoding and (b) verification/decoding processes.
Figure 1.1 Biometric Encryption – Conceptual Block Diagram: (a) Enrollment/Encoding Process; (b) Verification/Decoding Process
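As a concrete illustration of the encoding step, the sketch below binds a secret to a set of genuine minutiae points: the secret is encoded as polynomial coefficients, genuine points are projected onto the polynomial, and chaff points that do not lie on it are added. This is a simplified sketch over real numbers rather than the finite-field arithmetic used in [6], and all names and parameters are illustrative, not the thesis implementation.

```cpp
#include <algorithm>
#include <cmath>
#include <random>
#include <vector>

struct VaultPoint { double x, y; };

// Evaluate the secret polynomial with Horner's rule.
double evalPoly(const std::vector<double>& coeff, double x) {
    double y = 0.0;
    for (auto it = coeff.rbegin(); it != coeff.rend(); ++it)
        y = y * x + *it;
    return y;
}

// Fuzzy vault encoding (simplified): genuine minutiae are placed ON the
// secret polynomial, chaff points are placed OFF it, and the union is
// shuffled so the two are indistinguishable without the biometric.
std::vector<VaultPoint> encodeVault(const std::vector<double>& minutiaeX,
                                    const std::vector<double>& secretCoeff,
                                    int numChaff, double fieldMax) {
    std::mt19937 rng(42);  // fixed seed for a reproducible sketch
    std::uniform_real_distribution<double> uni(0.0, fieldMax);

    std::vector<VaultPoint> vault;
    for (double x : minutiaeX)                       // genuine points
        vault.push_back({x, evalPoly(secretCoeff, x)});

    while ((int)vault.size() < (int)minutiaeX.size() + numChaff) {
        VaultPoint c{uni(rng), uni(rng)};            // random chaff candidate
        if (std::fabs(c.y - evalPoly(secretCoeff, c.x)) > 1.0)  // must miss the curve
            vault.push_back(c);
    }
    std::shuffle(vault.begin(), vault.end(), rng);   // hide which points are genuine
    return vault;
}
// Decoding (not shown) filters the vault with a freshly captured minutiae
// set and reconstructs the polynomial, e.g. by interpolation with error
// correction, to recover the secret coefficients.
```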
Biometric encryption systems are suitable for stand-alone security and authentication devices in commercial applications such as bank ATMs, office security administration, employee attendance and records, e-commerce, home security systems, national medical databases, citizen identity (ID) records, international travel control and crime control, as well as location access control for venues such as concert halls and stadiums. It is essential that these applications incorporate identity-based security and advanced trust systems to be adequately reliable and secure. In addition, the underlying electronic platform for these applications should be embedded, high-performance and real-time. It is therefore prudent to implement biometric encryption and the associated security functions in a System-on-Chip (SoC) design.
An exemplary application of biometric encryption is the Biometric ATM that was developed as an SoC prototype in this research. This application combines three subsystems consisting of biometric image pre-processing, the fuzzy vault scheme and AES encryption, as shown in Figure 1.2 below.

Figure 1.2 Biometric Encryption Application – Biometric ATM
1.2 System-Level Design and System-on-Chip

1.2.1 Digital Design Methodology – from RTL to ESL
The traditional method for designing digital systems, the RTL-based design methodology, consists of a common design specification phase, followed by separate design and verification phases for the software and hardware parts of the design, followed by system integration and testing, and finally implementation and deployment. As the size and complexity of systems increase, the conventional design practice illustrated in Figure 1.3 is no longer adequate to produce efficient and timely results. Today's system design requires extensive hardware verification, better architectural analysis to select the optimal architecture, and a complete set of tightly integrated software tested with the hardware to ensure proper functionality. The design of complex systems can no longer be carried out efficiently through behavioral synthesis at the gate or Register Transfer Level (RTL). The current trend is to raise the design abstraction level from RTL to the Electronic System Level (ESL). Figure 1.4 depicts the ESL design methodology. System-level design is different from behavioral synthesis: behavioral synthesis is the process of mapping an algorithm to a hardware device, whereas system-level design is a high-abstraction-level design of a complete system architecture, including the hardware, software and communication interfaces [13].
ESL design is the process of describing hardware functionality at a higher level of abstraction. The primary motivation for design at a higher abstraction level comes from the increasing complexity of modern system design as well as the demand for productivity in terms of shorter time-to-market. System-level models such as mathematical models, algorithms, state charts, class diagrams and schematics describe the system functionality in minimal terms. Around 80% of the design decisions are taken in this phase of the project, which takes around 20% of the total design time [14]. Design constraints such as programming abstraction, runtime environments, timing, performance, cost, power and area must be taken into account at this stage.

Figure 1.3 Traditional Design Methodology

Figure 1.4 ESL Design Methodology
There are clear advantages to using the ESL design methodology. Design at higher levels of abstraction enables earlier verification. The designer is able to efficiently design complex systems in a unified hardware/software design environment. Early design space exploration enables the designer to consider multiple architectures for the system and choose the best one. The optimal power/performance trade-off can be found by selecting the best hardware-software partitioning option. Early functional verification performed at a higher abstraction level can significantly cut down the design cycle and prevent the need for re-design [15]. ESL increases the degree of automation in design verification and implementation by facilitating design reuse, and hence reduces effort overhead [13].
1.2.2 System Level Verification and Testbench Generation
In the ESL design methodology, the design team begins by creating design models of the system based on the system specification and requirements. The design model is refined repeatedly from higher to lower abstraction levels. At each design level, the correctness of the refinement process and of the resulting models is verified thoroughly. This verification is crucial to avoid costly mistakes and long re-design cycles. A survey of the global IC design industry found that functional verification is the most time-consuming activity in the design cycle, as shown in Figure 1.5 below [16].
Figure 1.5 Percentage of Time Taken for Design Cycle Activities (Adapted from [16])
The traditional approach to design verification consisted of a common specification phase for both the hardware and software portions of a system. Then, after a small delay, the development of hardware and software was performed by separate design teams, and the final design phase consisted of system integration. This approach caused a long design cycle, with full verification performed only after the hardware prototype was available, which made it too late to correct errors and make modifications. The ESL approach is to implement hardware-software co-verification. With this method, hardware and software development can be performed simultaneously, starting as early as the stage after system specification. Hardware-software integration can also be performed much earlier, which helps in detecting interface problems and making the required modifications. Hardware-software co-verification is performed using tools such as simulation- and emulation-based platforms [17].
However, there is a problem in simulation-based design verification that needs to be addressed: the generation of testbenches. Effective system design and verification require the verification vectors to be reusable; that is, the same testbench should be refined and used at lower levels of design abstraction. Currently, there is no industry standard for design abstraction levels or for the tools and languages used in the process. This has resulted in the creation of various languages and design/verification platforms, each with its own advantages and disadvantages. However, the use of multiple tools and languages isolates each design level from the others. Therefore, testbenches generated at one design level have to be re-created from scratch in a different language, or targeted for a different tool or platform, at the next design level. This substantially increases the design cycle period and reduces coherence between design models and testbenches at different design abstraction levels.
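To ground the reuse problem, the sketch below shows the kind of self-checking SystemC testbench that one would want to carry across abstraction levels: because the stimulus and checking logic talk to the design only through ports, the same testbench can in principle be re-bound to a refined model. The adder design and the test values are hypothetical placeholders, not modules from this thesis.

```cpp
#include <systemc.h>

// Hypothetical design under test: a combinational adder.
SC_MODULE(Adder) {
    sc_in<int>  a, b;
    sc_out<int> sum;
    void compute() { sum.write(a.read() + b.read()); }
    SC_CTOR(Adder) { SC_METHOD(compute); sensitive << a << b; }
};

// Self-checking testbench: stimulus and checks live behind ports, so the
// same module can later be re-bound to a refined (e.g. timed) model.
SC_MODULE(Testbench) {
    sc_out<int> a, b;
    sc_in<int>  sum;
    void stimulus() {
        for (int i = 0; i < 4; ++i) {
            a.write(i); b.write(2 * i);
            wait(10, SC_NS);                 // let the DUT settle
            sc_assert(sum.read() == 3 * i);  // golden-reference check
        }
        sc_stop();
    }
    SC_CTOR(Testbench) { SC_THREAD(stimulus); }
};

int sc_main(int argc, char* argv[]) {
    sc_signal<int> a, b, sum;
    Adder dut("dut");   dut.a(a); dut.b(b); dut.sum(sum);
    Testbench tb("tb"); tb.a(a);  tb.b(b);  tb.sum(sum);
    sc_start();
    return 0;
}
```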
1.2.3 System-on-Chip (SoC)
A System-on-Chip (SoC), as shown in Figure 1.6, is an integrated circuit (IC) that integrates, on a single monolithic substrate, an embedded processor, bus, memory, I/O interface units and dedicated hardware blocks such as Digital Signal Processors (DSPs), accelerators, timers and Phase-Locked Loops (PLLs), together with their interconnections, as well as the software required to drive them.
SoCs can be considered the future of microcontrollers and of multi-chip systems-in-a-package. SoCs are used in most digital devices because they contain the same basic blocks but with higher compute capacity and more memory, such that they can perform complex functions involving digital or analog signals. The advantages of using an SoC include reduced device size, lower cost, lower power consumption and increased performance. SoC implementation in embedded systems can deliver high performance in complex embedded tasks such as data security, signal processing and network communication applications.
Figure 1.6 System-on-Chip Architecture
A typical SoC design involves a complex system architecture which consists
of multiple processing cores, memory and communication blocks. The SoC is made
up of multiple sub-systems, where each sub-system may contain multiple Intellectual
Property (IP) blocks, and each IP block in turn is made of multiple computational
modules. SoCs therefore cannot be designed and verified with conventional methods and require the application of state-of-the-art system-level design
methodologies. SoC design requires hardware/software co-design and early design
verification to ensure that the hardware and software are compatible and the design
intent has been achieved.
1.3 Problem Statement
Biometric encryption is based on compute-intensive algorithms, and hence its design and implementation in an SoC is a highly complex task. The fingerprint-based fuzzy vault scheme proposed by Nandakumar et al. in [6] has a well-defined and robust architecture but is not suitable for implementation on an SoC due to its compute-intensive processes. The complete system behavior and all the associated algorithms must be studied thoroughly. Implementing any design on an SoC requires it to function in a resource-constrained, real-time environment. Hence, the initial stage of an SoC design involves optimizing the underlying process blocks and replacing slow algorithms with more efficient ones. The system's computational blocks need to be enhanced for performance, and the inter-process communication needs to be efficient and cost-effective. Therefore, the biometric encryption system needs to be analyzed in order to enhance existing algorithms or propose new, faster algorithms where appropriate. This will result in a new FVBE architecture that is targeted for SoC implementation.
The chaff generation algorithm is one of the most compute-intensive algorithms in the biometric encryption system. It is critical for the system because the number of chaff points generated affects the security of the scheme: the more chaff points added to the template, the more infeasible polynomial reconstruction becomes, and thus the better the system is protected from being compromised by unauthorized parties. Previously, almost all fuzzy vault scheme implementations were in software on workstations with high computing power, so the expense of a compute-intensive process in terms of time, memory and logic cost was not a consideration. For the implementation of the fuzzy vault scheme in an SoC environment, however, the existing chaff generation algorithm is found to be most unsuitable. This has led to the key contribution of this thesis, which is the proposal of a new, faster chaff generation algorithm.
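To make the cost concrete, the following is a minimal sketch of a generic rejection-sampling chaff generator of the kind used in earlier software implementations; it is a simplified reconstruction over real-valued coordinates under assumed parameters (minDist, fieldMax), not the algorithm of [6] nor the new algorithm proposed in this thesis. Each accepted candidate must be screened against every point already in the vault, so the work per candidate grows with vault size.

```cpp
#include <cmath>
#include <random>
#include <vector>

struct Point { double x, y; };

double evalPoly(const std::vector<double>& coeff, double x) {
    double y = 0.0;
    for (auto it = coeff.rbegin(); it != coeff.rend(); ++it)
        y = y * x + *it;                       // Horner's rule
    return y;
}

// Baseline chaff generation by rejection sampling: a candidate is kept
// only if it keeps a minimum distance from every existing vault point
// (so chaff looks like plausible minutiae) AND does not lie near the
// secret polynomial (so it cannot leak the secret).
std::vector<Point> generateChaff(std::vector<Point> vault,   // genuine points
                                 const std::vector<double>& secretCoeff,
                                 int numChaff, double minDist, double fieldMax) {
    std::mt19937 rng(7);                       // fixed seed for the sketch
    std::uniform_real_distribution<double> uni(0.0, fieldMax);
    const int target = (int)vault.size() + numChaff;

    while ((int)vault.size() < target) {
        Point c{uni(rng), uni(rng)};
        if (std::fabs(c.y - evalPoly(secretCoeff, c.x)) <= minDist)
            continue;                          // too close to the polynomial
        bool clear = true;
        for (const Point& p : vault)           // O(|vault|) scan per candidate
            if (std::hypot(p.x - c.x, p.y - c.y) < minDist) { clear = false; break; }
        if (clear) vault.push_back(c);
    }
    return vault;
}
```

Because free space shrinks as the vault fills, late candidates are rejected only after near-complete scans, and the total cost grows roughly quadratically with the number of chaff points, which is precisely the behavior that is unaffordable in a resource-constrained SoC environment.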
The biometric encryption system is made up of multiple sub-systems that perform varied and complex computations. These sub-system blocks should be modeled and tested during the early phases of the design process. To facilitate this, the system should be modeled at higher levels of abstraction; the blocks are then refined systematically through lower levels of abstraction until the final implementation model is created. The challenge of designing a complex SoC lies in the efficient division of the computation blocks between hardware and software implementation.
Finding the optimal architecture involves finding the right balance in hardware-software partitioning for the targeted application. The function blocks of the system must be implemented in hardware or software depending on design factors such as the computational requirements of multiple complex algorithms, the communication between the various sub-modules and the frequency of memory access. The optimal architecture implements the appropriate system blocks in hardware and software and has an optimized communication interface between the two partitions. This is made possible by performing hardware-software (hw-sw) co-design and co-verification. Consequently, system-level modeling of the biometric encryption SoC is required. System-level modeling involves design at the System Specification, Untimed Functional (UTF) and Cycle-accurate Timed Functional (CTF) levels, and allows multiple hardware-software partitioning options to be considered. In this way, the optimal system architecture can be chosen to meet the design specifications and resource constraints. UML documentation is used to record and monitor the progress of the design cycle, and the design is modeled using the industry-standard modeling language SystemC. The final output of this design cycle is an RTL-implementable model of the design.
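To make the UTF-to-CTF refinement concrete, the sketch below contrasts an untimed SystemC module, which completes its computation in zero simulated time, with a cycle-accurate refinement of the same function that consumes one clock edge per multiply-accumulate step. The polynomial evaluator and its coefficients are hypothetical stand-ins, not FVBE blocks, and the clock generator and testbench plumbing are omitted.

```cpp
#include <systemc.h>

// UTF model: the computation completes in zero simulated time.
SC_MODULE(PolyEvalUTF) {
    sc_fifo_in<int>  in;
    sc_fifo_out<int> out;
    void run() {
        while (true) {
            int x = in.read();            // blocking FIFO read, no clock
            out.write(3*x*x + 2*x + 7);   // hypothetical polynomial, evaluated at once
        }
    }
    SC_CTOR(PolyEvalUTF) { SC_THREAD(run); }
};

// CTF refinement: same function, now one clock cycle per Horner step.
SC_MODULE(PolyEvalCTF) {
    sc_in<bool>      clk;
    sc_fifo_in<int>  in;
    sc_fifo_out<int> out;
    void run() {
        const int coeff[3] = {3, 2, 7};   // same hypothetical coefficients
        while (true) {
            int x = in.read(), acc = 0;
            for (int c : coeff) {          // Horner's rule, cycle by cycle
                wait(clk.posedge_event());
                acc = acc * x + c;
            }
            out.write(acc);
        }
    }
    SC_CTOR(PolyEvalCTF) { SC_THREAD(run); }
};
```

Because both modules expose the same FIFO interface, a testbench written against the UTF model can be re-bound to the CTF model unchanged, which is the reuse property the verification framework relies on.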
At each level of abstraction, the blocks should be verified via appropriate testbenches to ensure correct functionality according to the design specifications. The inter-communication and synchronization between the computation blocks should also be tested at each abstraction level, using the same testbenches. A key requirement of the verification process is therefore testbench generation, to create appropriate testbenches for each computation module or algorithm in the design. System-level verification should be performed at each abstraction level to ensure that the integrated blocks are working correctly. Recreating testbenches for the same blocks or systems at each design abstraction level would be both time-consuming and wasteful; the best practice is to take the higher-level testbenches and refine and reuse them at lower levels of abstraction to continue the verification process. A good verification framework fulfills two main criteria [18]:
• It allows early system verification
• It facilitates the re-use of testbenches and test vectors throughout the design cycle
1.4 Objective
The goal of this research is to design a biometric encryption (BE) system for
implementation in an SoC, by applying system-level design methodology based on
SystemC. For an effective SoC implementation, the BE algorithmic blocks are
developed such that compute-intensive algorithms are modified and enhanced for
high performance and made suitable for hardware acceleration. Thus the objectives
of this thesis are:
1. To propose a biometric encryption system architecture based on a fuzzy vault scheme that is suitable for implementation in an SoC. This includes the proposal of a new, fast chaff generation (CG) algorithm that is less compute-intensive than the existing CG algorithm.
2. To apply system-level design methodology in the development of an FVBE system for implementation in an SoC. The FVBE SoC is first modeled at the algorithmic level and then refined iteratively through the system-level abstraction models to generate System Specification, Untimed Functional (UTF) and Cycle-accurate Timed Functional (CTF) models in SystemC.
3. To propose a methodology for testbench generation that facilitates system-level verification. This methodology allows the creation of testbenches that can be refined and reused at lower levels of design abstraction. The testbenches are first built at the algorithmic level in MATLAB and, after being used to verify the algorithmic design model, are refined for use with the design models at lower levels by interfacing the SystemC design model with the MATLAB testbench in this verification framework (see the sketch after this list).
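The following minimal sketch illustrates the kind of coupling described in objective 3, using the legacy MATLAB C Engine API (engOpen, engEvalString, engGetVariable) to pull test vectors produced by a MATLAB testbench into a SystemC simulation. The MATLAB function gen_vectors and the surrounding plumbing are hypothetical placeholders; the actual interface of the thesis framework may differ.

```cpp
#include <systemc.h>
#include <vector>
#include "engine.h"   // legacy MATLAB C Engine API

int sc_main(int argc, char* argv[]) {
    Engine* ep = engOpen(nullptr);                 // attach to a MATLAB session
    if (!ep) { SC_REPORT_ERROR("tb", "cannot start MATLAB engine"); return 1; }

    engEvalString(ep, "tv = gen_vectors();");      // hypothetical MATLAB testbench fn
    mxArray* tv = engGetVariable(ep, "tv");
    const double* raw = mxGetPr(tv);
    std::vector<double> vectors(raw, raw + mxGetNumberOfElements(tv));
    mxDestroyArray(tv);
    engClose(ep);

    // A stimulus process (not shown) would stream 'vectors' into the
    // SystemC design model's input FIFOs during simulation, and responses
    // could be pushed back to MATLAB for golden-model comparison via
    // engPutVariable() before calling sc_start().
    return 0;
}
```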
1.5 Scope of Work
The scope of work of this research includes the following:
• The proposed biometric encryption system applies the fuzzy vault scheme based on the work by Nandakumar et al. [6]. It combines finger-vein biometrics and an AES crypto key as the secret to be hidden in the fuzzy vault.
• The CODESL co-simulation platform [19] is used to model the system at higher levels of abstraction.
• The finger-vein biometric feature extraction subsystem is not in the scope of this work. The applied minutiae template is derived from the biometric system developed in [20].
• The biometric encryption system is designed, refined and verified at the algorithmic, system specification, untimed functional, cycle-accurate timed functional and register transfer levels.
• The target platform is Altera's Nios II Development Kit with a Stratix II FPGA and a 100 MHz clock.
• The tools utilized in this research include Altera Quartus, Nios II IDE, Eclipse and Glade, together with the industry-standard languages UML, VHDL, C, C++ and SystemC.
1.6 Research Contributions
The proposed fuzzy vault-based biometric encryption system (FVBE) provides a double-strength security scheme that overcomes the inherent weaknesses of existing data security systems. The implementation of the FVBE in an SoC makes it ideal and relevant for today's commercial applications due to its real-time and embedded nature. The design of the SoC uses system-level modeling, which is more efficient than the traditional RTL design methodology for complex digital systems. Hardware-software co-design and design space exploration find the optimal hardware-software partitioning and resource utilization to enable the system to function well in a real-time environment.

System-level verification makes it possible to detect design flaws earlier in the design cycle, which saves a considerable amount of time and resources spent on redesign or correction, as opposed to traditional verification late in the design cycle. Verification detects inconsistencies between design models at different abstraction levels and prevents divergence from the original design intent and specification. The proposed verification framework facilitates block-level and system-level verification, in addition to multi-level verification that ensures design correctness throughout the design process across the various abstraction levels. In summary, the main contributions of this research work are:
1. A biometric encryption system that consists of new and modified process algorithms that reduce the computational requirements so that it can be implemented as an SoC.
2. A systematic design of the FVBE using a system-level modeling approach. The CODESL platform is enhanced by adding a top level that allows algorithmic modeling of the design, and the FVBE is modeled and refined through five distinct design abstraction levels in order to find the best architecture.
3. A verification framework for complex digital system design. The framework is responsible for testbench generation and refinement through all the design abstraction levels. The FVBE is tested using this framework: after the process blocks in the FVBE are tested individually at each abstraction level using the testbenches, the complete FVBE is verified using the system verification approach.
1.7 Organization of the Thesis
This thesis consists of six chapters. The first chapter is an introduction and overview of the thesis work. The second chapter contains the literature review for this research. The third chapter discusses the biometric encryption system and the enhancements made to its process blocks to enable SoC implementation. The fourth chapter describes the system-level modeling process using the CODESL co-simulation platform and proposes a verification methodology that is used alongside the co-simulation platform for early design verification and correction. The fifth chapter presents the results and analysis of the experimental work carried out throughout the research. The final chapter summarizes the PhD work and contains recommendations for future work.