Understanding Error Rates in Biometric Access Control

This document was prepared by Recognition Systems, an Ingersoll-Rand company, to
provide a consolidation of independent research on biometric devices used in
physical access control.
What are Biometrics?
Biometrics identify people by a unique human characteristic. The size and shape of a
hand, a fingerprint, the voice, and several aspects of the eye are just some of these unique
attributes. “The word "biometric" simply means the measurement of a living trait,
whether physiological or behavioral. Biometric technologies compare a person's unique
characteristics against a previously enrolled image for the purpose of recognizing ...” i
Hand Geometry
Hand Geometry uses the size and shape of the hand to verify a user's identity. Users enter
their unique ID either through the keypad or with a credential and present their hand for
verification. The Hand Geometry unit takes 96 measurements of the hand, including the
length, width, and height of the fingers. The measurements are converted by a mathematical
algorithm into a 9-byte template that is compared with the previously captured template.
The device does not use palm or fingerprints, and verification time is less than a second.
Fingerprint Scanning
Fingerprinting has been used in automated criminal justice processing for many years.
There are two primary approaches to capturing the image: optical sensors and silicon-based
sensors that measure capacitance. The two most widely used methods for
comparing the captured image are pattern-based and minutia-based. According to the
International Biometrics Group (IBG), “For most vendors, minutiae are located on a
fingerprint and converted into a template. The more quality minutiae are located, the
greater chance for high accuracy. Upon verification, another set of minutia is extracted,
converted to a template, then compared to the enrollment. If, for example, 31 minutiae
are extracted for enrollment, perhaps only 10 will have to match upon verification.”ii
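Minutia-based comparison of the kind IBG describes can be sketched as a simple point-set match. The code below is an illustrative toy, not any vendor's algorithm: minutiae are modeled as (x, y, angle) triples, the tolerance values are invented, and a real matcher must also compensate for rotation and translation of the whole print.

```python
import math

# Hypothetical sketch of minutia-based matching, not any vendor's algorithm.
# Each minutia is an (x, y, angle) triple; two minutiae "match" when they
# fall within a distance and angle tolerance. Tolerances are illustrative.

def minutiae_match_count(enrolled, candidate, dist_tol=10.0, angle_tol=0.26):
    """Count enrolled minutiae that pair with an unused candidate minutia."""
    matches = 0
    used = set()
    for x1, y1, a1 in enrolled:
        for j, (x2, y2, a2) in enumerate(candidate):
            if j in used:
                continue
            if math.hypot(x1 - x2, y1 - y2) <= dist_tol and abs(a1 - a2) <= angle_tol:
                matches += 1
                used.add(j)
                break
    return matches

def verify(enrolled, candidate, required=10):
    # Per the IBG quote: e.g. 31 minutiae enrolled, only 10 need to match.
    return minutiae_match_count(enrolled, candidate) >= required
```

Lowering `required` makes the matcher more forgiving (fewer false rejects, more false accepts), which previews the FAR/FRR trade-off discussed later.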
Availability of Independent Test Data
One of the difficulties faced by end users when evaluating biometric solutions is a lack of
credible independent research upon which to base their decisions.
IBG’s web site states, “Vendors commonly state False Rejection Rates (FRR), or the
likelihood that the system will not "recognize" an enrolled user's finger-scan, in the
vicinity of 0.01%…Independent testing under controlled, real-world conditions ….has
shown many of the vendors' accuracy estimates to be off by several orders of magnitude.
Testing showed some devices to have a False Reject Rate of nearly 50%, meaning that up
to half of enrolled users were not recognized by the system just weeks after
enrollment.” iii
Hand Geometry and fingerprint are the two most widely used technologies in access
control applications. Two studies that include both Hand Geometry and fingerprint,
and that mimic an access control application, are included in this discussion. The more recent
study, conducted by the Centre for Mathematics and Scientific Computing and sponsored by
the Communications and Electronics Security Group (CESG), was published in 2001.
The earlier report was published in 1991 and was prepared by Sandia National
Laboratories.
Both of these studies utilize a test protocol that calls for multiple users to use a single
device. This is typically the scenario facing an access control application. Alternatively,
many tests of fingerprint devices are targeted for computer network log-on. These less
demanding applications allow a sensor to be tuned to each individual. This approach is
impractical for access control applications and therefore not included here.
Performance Characteristics
The performance of biometric devices is commonly characterized by their failure
mechanisms:
• Failure to enroll
• Failure to acquire
• False Accept Rate/False Reject Rate
• Equal Error Rate
Failure to enroll
The failure to enroll rate measures the percentage of individuals for whom a system
cannot generate a useful record. This includes:
• Those individuals who lack the biometric feature
• Those who could not provide an image of sufficient quality at enrollment
In essence, the failure to enroll rate is the percentage of the population who will be
unable to use the system.
System                         Failure to Enroll Rate
Hand Geometry                  0.0 %
Fingerprint – Silicon Chip     1.0 %
Fingerprint – Optical          2.0 %
Source: CESG iv
Although CESG reported a 0.0% error for Hand Geometry, it is understood that there is a
small segment of the population who are without both hands. Of course, these
individuals would also have difficulty enrolling in a fingerprint system. In the instance
where a user is without his/her right hand, the left hand may be used.
In comparing results, the higher failure rates of fingerprint devices can easily be
explained by viewing one's own hand. The shape and size of the hand are clearly
discernible irrespective of lighting, temperature, moisture, or dirt on the hand. The
fine ridges and minutiae used by fingerprint systems are less discernible and can be easily
obscured by contaminants. Difficulties in obtaining a “clean” image are the most
significant barrier to making repeatable and proper verification of individuals.
Failure to Acquire
The failure to acquire rate measures the percentage of attempts for enrolled individuals
for which the system is unable to capture an image of sufficient quality. Simply stated, it
is the rate at which the system fails to read a biometric trait.
System                         Failure to Acquire Rate
Hand Geometry                  0.0 %
Fingerprint – Silicon Chip     2.8 %
Fingerprint – Optical          0.8 %
Source: CESG v
Again, Hand Geometry systems, in looking at the whole hand, are undisturbed by minor
imperfections. Image acquisition is a greater challenge for fingerprint systems. Low-cost,
low-resolution optical fingerprint systems have issues with the fine prints associated with
some ethnicities. Silicon-based systems, because of the nature of the technology, are
further burdened by the moisture content of the skin. Problems with wet hands are not
uncommon. Electrostatic Discharge (ESD), typically found in dry environments, can also
be a significant cause of damage to these readers. As reported by Sandia National
Laboratories, “A number of our users appear to have poor quality fingerprints that would
not produce good results, even when other fingers were tried. Another problem was
caused by low humidity during our test period. User’s skin would dry out to the point
where the system could not verify the user.”vi
FAR/FRR
At the heart of an access control solution is the ability of the system to prohibit access to
unauthorized individuals while at the same time providing unfettered access for
authorized users.
The False Accept Rate (FAR) is the proportion of unauthorized individuals who would be
granted access by the system. The FAR assumes that whatever credential or personal
identification number (PIN) is used by the system is in the possession of the unauthorized
individual. (For comparison, a card-only system would have an FAR of 100%.)
The False Reject Rate (FRR) is the proportion of authorized users who will be incorrectly
denied access. One might consider this a measure of users' frustration with the system.
All systems have a definable threshold that determines the acceptable match criteria.
Plotting the FAR vs. FRR through the range of settings provides a good description of the
trade-off between the two errors. Systems are configured to perform at any one point
along the curve, and will have the corresponding rates.
Notice that there exists an inverse relationship between FAR and FRR. When the
sensitivity of the system is increased to lower the FAR, the FRR must increase.
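The inverse relationship can be seen in a small sketch: sweeping a match threshold over samples of genuine and impostor match scores shows FAR falling as FRR rises. The score lists below are invented for illustration and do not come from either study.

```python
# Illustrative FAR/FRR sweep over threshold settings. A higher score means a
# better match; the score lists are made-up samples, not data from either study.

def far_frr(genuine_scores, impostor_scores, threshold):
    # False accept: an impostor score at or above the threshold is let in.
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    # False reject: a genuine score below the threshold is turned away.
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
impostor = [0.3, 0.5, 0.2, 0.4, 0.65, 0.1]

for t in (0.45, 0.55, 0.65, 0.75):
    far, frr = far_frr(genuine, impostor, t)
    print(f"threshold {t:.2f}: FAR {far:.1%}  FRR {frr:.1%}")
```

Raising the threshold drives FAR down and FRR up, tracing out the trade-off curve described above.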
In this example, at the given system settings:
Setting A: FAR = 11%, FRR = 2%
Setting B: FAR = 2.5%, FRR = 9.5%
[Figure: Example False Reject Rate vs. False Accept Rate trade-off curve (both axes 0% to 15%), with Setting A and Setting B marked.]
Implications of FAR/FRR
When evaluating the trade-offs between FAR and FRR, it is important to consider how
the error rates will affect daily operations. In a facility of 100 enrollees using a biometric
device for physical access, assuming just 2 accesses per day in a 5-day week, the system
processes 1,000 requests for access. An FRR of 0.5% translates to five problems a week,
whereas an FRR of 2% equates to twenty problems. The acceptable level of improperly
denied access is dependent on the application.
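The weekly arithmetic above can be written out directly; the facility size and usage figures are those from the text.

```python
# Expected false rejects per week = access requests per week x FRR.
def weekly_false_rejects(enrollees, accesses_per_day, days_per_week, frr):
    requests = enrollees * accesses_per_day * days_per_week
    return requests * frr

# 100 enrollees, 2 accesses/day, 5-day week = 1,000 requests.
print(weekly_false_rejects(100, 2, 5, 0.005))  # 5.0 problems per week
print(weekly_false_rejects(100, 2, 5, 0.02))   # 20.0 problems per week
```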
Similarly, the desired False Accept Rate (FAR) can be determined as a function of the
number of attacks on the security system. Ideally, a system would never grant access to
an unauthorized user. In practice, however, we know that no system is foolproof and that
as we lower the FAR we increase the FRR, so we must determine a reasonable level of risk.
Estimate the number of attacks in which the attacker possesses whatever credential or personal
identification number (PIN) is used by the system to be 10 per year (a high estimate). An
FAR of 1% would mean that in 10 years only 1 unauthorized user would be granted
access. An FAR of 0.1% equates to just 1 unauthorized user in 100 years.
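The same estimate expressed as code; 10 credentialed attacks per year is the text's assumed figure.

```python
# Expected years between false accepts = 1 / (attacks per year x FAR).
def years_per_false_accept(attacks_per_year, far):
    return 1.0 / (attacks_per_year * far)

print(years_per_false_accept(10, 0.01))   # FAR 1%: one false accept per ~10 years
print(years_per_false_accept(10, 0.001))  # FAR 0.1%: one per ~100 years
```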
FAR/FRR Test Results
Results from the CESG testing are presented below. The curves represent the false reject
rates after three successive attempts. The three-try results are most applicable to everyday
use in that systems typically permit up to three successive attempts before a request for
access is denied, requiring a system administrator to intercede.
[Figure: Three-try false reject rate vs. false accept rate curves for FP-optical, FP-silicon, FP-silicon (proto), and Hand Geometry (FRR axis 0% to 15%, FAR axis 0% to 6%).]
Source: CESG vii
Sandia National Laboratories also evaluated both Hand Geometry and fingerprint devices,
but was unable to generate curves for the fingerprint device. The test, which involved
over 10,000 attempts with the Hand Geometry device, yielded the following error rates at
factory default settings.
System            False Accept Rate    False Reject Rate
Hand Geometry     ~ 0.1 %              < 0.1 %
Source: Sandia viii
As highlighted earlier, independent performance data is often difficult to obtain and
testing rarely spans a long period of time. Occasionally, information can be gathered
from customer installations, providing unprecedented insight into system performance
over time. One such facility, a U.S. Nuclear Power facility where Hand Geometry
readers are used, reported data over a 27-week period involving 2,800 users and 250,000
transactions.
The resulting FRR over that 6-month period is shown in the following chart. The
corresponding FAR for the threshold setting used is 0.1%.
[Figure: U.S. Nuclear Power Station False Reject Data – weekly false reject percentage (scale 0% to 1%) plotted for weeks 1 through 27.]
EER
The Equal Error Rate (EER), also known as the error cross-over point, provides a single
measure of the effectiveness of a system. In the previous example the FAR is plotted
against the FRR. For this approach the EER is the point on the curve closest to zero.
Alternatively, one could plot both the FAR and FRR curves versus the sensitivity setting.
The point where the two curves cross is the EER, with the desired state being zero.
Below we have recharted the data of the previous example to illustrate.
Example
In this example, at the given system settings:
Setting A: FAR = 11%, FRR = 2%
Setting B: FAR = 2.5%, FRR = 9.5%
Setting C: FAR = 5%, FRR = 5%
[Figure: FAR and FRR curves plotted against the sensitivity setting (error rate 0% to 10%), with the two curves crossing at Setting C.]
As evidenced in the previous section, as the FAR decreases the FRR increases. The
converse is also true: as the FRR decreases the FAR increases. The EER is the point at
which the two error rates are equal. For any FAR value less than the EER, the FRR must
be greater than the EER, and vice versa.
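Locating the cross-over point numerically can be sketched as: sweep the threshold, compute both rates, and keep the setting where they are closest. The score samples below are invented for illustration.

```python
# Sketch of locating the Equal Error Rate: sweep the threshold, compute both
# rates, and keep the setting where |FAR - FRR| is smallest. Invented samples.

def equal_error_rate(genuine_scores, impostor_scores, thresholds):
    best = None
    for t in thresholds:
        far = sum(s >= t for s in impostor_scores) / len(impostor_scores)
        frr = sum(s < t for s in genuine_scores) / len(genuine_scores)
        if best is None or abs(far - frr) < abs(best[1] - best[2]):
            best = (t, far, frr)
    return best  # (threshold, FAR, FRR) at the cross-over

genuine = [0.9, 0.8, 0.85, 0.7, 0.95, 0.6]
impostor = [0.3, 0.5, 0.2, 0.4, 0.65, 0.1]
t, far, frr = equal_error_rate(genuine, impostor, [i / 100 for i in range(100)])
```

With coarse, finite score samples the two rates may never be exactly equal, which is why the sketch minimizes the gap rather than solving for equality.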
EER Test Results
Although no “Failure to Enroll” or “Failure to Acquire” data was generated for the
prototype silicon fingerprint system, false accept and false reject curves were provided.
The associated Equal Error Rates from the tests are provided below, as well as the results
from the Sandia report.
System                            Equal Error Rate
Hand Geometry *                   0.2 %
Hand Geometry **                  0.5 %
Fingerprint – Optical **          6.8 %
Fingerprint – Silicon **          4.6 %
Fingerprint – Silicon (proto) **  1.75 %
* Source: Sandia ix
** Source: CESG x
[Figures: FAR and FRR curves versus sensitivity setting for Hand Geometry, Fingerprint – Optical, Fingerprint – Silicon, and Fingerprint – Silicon (proto), each showing the error cross-over point.]
Summary
As the use of biometrics grows, the need to understand the issues related to them
becomes more critical. User acceptance will always be central to successfully utilizing a
biometric. Unfortunately, there is no way that a biometric manufacturer can specify a
device’s user acceptance. Different classes of applications demand different biometric
performance in order to achieve high user acceptance. The key quantifiable performance
factors of a biometric are its various error rates. Therefore, understanding what these
different error rates mean and how they can impact acceptance for a given application is
critical.
Certainly, the future is bright for the biometric industry and their place in access control
applications. The goal of access control is to control where people can and cannot go.
Only a biometric device truly provides this capability to the end user.
Sources:
2001 CESG Report – Data presented above has been provided by CESG as used in their public report; the
graphs and figures shown are not as presented in the report but come from the same data set. CESG does
not intend its report to be an endorsement of any biometric, and the public report can be found at
http://cesg.gov.uk/biometrics/
1991 Sandia Report
International Biometrics Group
i Ken Phillips, “Unforgettable biometrics – Your body is your key (just try not to lose it),” PC Week Labs, 10.29.97, http://www.zdnet.com/eweek/reviews/1027/27bioapp.html
ii International Biometric Group, www.finger-scan.com/finger-scan_accuracy.htm
iii International Biometric Group, www.finger-scan.com/finger-scan_accuracy.htm
iv Biometric Product Testing Final Report, March 2001, CESG/BWG Biometric Test Program, pg. 9
v Biometric Product Testing Final Report, March 2001, CESG/BWG Biometric Test Program, pg. 9
vi Sandia Report SAND91-0276 * UC-906, pg. 17
vii Biometric Product Testing Final Report, March 2001, CESG/BWG Biometric Test Program, data source
viii Sandia Report SAND91-0276 * UC-906, pg. 19
ix Sandia Report SAND91-0276 * UC-906, pg. 19
x Biometric Product Testing Final Report, March 2001, CESG/BWG Biometric Test Program, data source