dataPrivacy Partners Ltd.
4th Annual Privacy & Security Workshop
From Anonymisation to Identification: The Technologies of Today and Tomorrow
Peter Hope-Tindall
Chief Privacy Architect™
dataPrivacy Partners Ltd.
pht@dataprivacy.com
November 7, 2003
Privacy by Design ®
© 2003 dataPrivacy Partners Ltd.
Agenda
• Biometrics and Privacy
• Privacy Concerns
• Design & Implementation Issues
• Technology to protect Privacy
Biometrics Presentation
Privacy
• “the right to exercise control over your personal information.” – Ann Cavoukian
• “Privacy is at the heart of liberty in the modern state.” – Alan Westin
• “the right to be let alone”* – Warren & Brandeis

* Warren and Brandeis, "The Right to Privacy," 4 Harvard Law Review 193 (1890). The phrase "right to be let alone" had been coined by Judge Cooley several years earlier. See THOMAS M. COOLEY, COOLEY ON TORTS 29 (2d ed. 1888).
Security and Privacy – a technical view
Privacy
• data protection – FIPs (not FIPS)
Security
• authentication
• data-integrity
• confidentiality
• access controls
• non-repudiation
n.b.
FIPs: Fair Information Practices
FIPS: Federal Information Processing Standards
Security vs. Privacy

Security:
• Accountable to President/CEO/Board of Directors.
• Risk-based assessment (how likely is it?).
• Access and use controls defined by the system owner.
• Has been focused on protecting against outsiders.

Privacy:
• Accountable to the data subject.
• Capabilities-based assessment (is it possible?).
• Access and use controls defined by use limitation, consent of the data subject and legislation.
• Protecting against outsiders, insiders and the system owner.
The Complex Nature of Privacy
• Identity: measures the degree to which information is personally identifiable.
• Linkability: measures the degree to which data tuples or transactions are linked to each other.
• Observability: measures the degree to which identity or linkability may be impacted by the use of a system; which other data elements are visible, implicitly or explicitly.
With thanks and apologies to the Common Criteria
Biometrics
• Biometric is derived from the Greek words bio (life) and metric (the measure of).
• “The automated use of Physiological or Behavioral Characteristics to determine or verify identity” – International Biometric Group (IBG)
• “‘Biometrics’ are unique, measurable characteristics or traits of a human being for automatically recognizing or verifying identity.”
Biometrics Schmetrics?
• Biometric: (noun) - one of various technologies that utilize
behavioral or physiological characteristics to determine or
verify identity. “Finger-scanning is a commonly used
biometric.” Plural form also acceptable: “Retina-scan and iris-scan are eye-based biometrics."
• Biometrics: (noun) - Field relating to biometric
identification. “What is the future of biometrics?”
• Biometric: (adjective) - Of or pertaining to technologies that
utilize behavioral or physiological characteristics to determine
or verify identity. “Do you plan to use biometric identification or
older types of identification?”
Biometric Template
• Distinctive encoded files derived from the unique features of a biometric sample
• A basic element of biometric systems
• Templates, not samples, are used in
biometric matching
• Much smaller amount of data than sample
(1/100th, 1/1000th)
• Vendor specific
• Different templates are generated each time
an individual provides a biometric sample.
Verification
• Also called 1:1 ‘Authentication’
• Performs comparison against a single
biometric record
• Answers question: “Am I who I say I am?”
Identification
• Also called 1:N Search
• Performs comparison against entire
biometric database
• Answers question: “Who am I?”
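The difference between the two modes can be sketched in a few lines of code. Everything below (the tiny feature vectors standing in for templates, the distance-based scorer, the threshold value, the names) is a hypothetical simplification, not any vendor's real matcher:

```python
# Illustrative sketch: templates modeled as small feature vectors,
# "matching" as a simple distance test against a tunable threshold.
# All names and numbers here are hypothetical.
import math

def similarity(t1, t2):
    """Toy similarity score: inverse of Euclidean distance."""
    return 1.0 / (1.0 + math.dist(t1, t2))

DATABASE = {                      # enrolled reference templates
    "alice": (0.10, 0.90, 0.40),
    "bob":   (0.90, 0.10, 0.20),
}
THRESHOLD = 0.5                   # match decision threshold

def verify(claimed_id, sample):
    """1:1 -- 'Am I who I say I am?' Compares against a single record."""
    return similarity(DATABASE[claimed_id], sample) >= THRESHOLD

def identify(sample):
    """1:N -- 'Who am I?' Compares against the entire database."""
    best = max(DATABASE, key=lambda i: similarity(DATABASE[i], sample))
    return best if similarity(DATABASE[best], sample) >= THRESHOLD else None

probe = (0.12, 0.88, 0.41)        # a fresh sample, close to alice's template
print(verify("alice", probe))     # True
print(identify(probe))            # alice
```

Note that verification touches one enrolled record while identification must scan the whole database, which is why 1:N systems raise the harder performance and privacy questions discussed later.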
Is DNA a biometric?
• DNA requires an actual physical sample
• DNA matching is not performed in real time
• DNA matching does not employ templates or feature extraction
• However, the policy issues and risks are identical
In a strict sense, then, DNA matching is not a biometric, in the same way that traditional forensic fingerprint examination is not a biometric.
Regardless of these distinctions, we believe that DNA-based technologies should be discussed alongside other biometric-based technologies inasmuch as they make use of a physiological characteristic to verify or determine identity. Beyond the definition, to most observers DNA looks, acts and may be used like other biometrics. The policy ramifications, while much more serious for DNA-based technologies, share some common attributes with other biometrics.
Taxonomy
Physiological Biometrics
• Finger Scanning
• Hand Geometry
• Facial Recognition
• Iris Scanning
• Retinal Scanning
• Finger Geometry
Behavioral Biometrics
• Voice Recognition
• Dynamic Signature Verification
• Keystroke Dynamics
(In reality all biometrics are both physiological and behavioral to some degree.)
Finger Scanning
• Minutiae-based or pattern-based
Hand Geometry
• Measures dimensions of hands
• Easy to use / widely used in access control applications
Facial Recognition
• Based on distinctive facial features
Iris Scanning
• Takes a picture of the iris.
• Performs an analysis of the ‘features’ of the iris: ridges, furrows, striations.
• Scan distance: up to 1 meter.
Retinal Scanning
• Utilizes distinctive patterns visible on the retina at the back of the eye.
Finger Geometry
• Measures the shape and size of a single finger (or a pair of fingers).
Voice Recognition
• Performs an analysis of features from an audio waveform.
Dynamic Signature Verification
• Measures the pressure, vector and number of strokes of a signature.
• Can be used with existing signature applications.
Keystroke Dynamics
• Measures the rhythm and distinctive timing patterns for keyboarding.
Other
• Ear Geometry
• Body Odour
• Gait (walking pattern)
Biometrics Summary

Ratings are listed in order (Accuracy, Ease of Use, User Acceptance, Stability, Cost), followed by typical applications and suitability for 1:1 / 1:N matching.

• Finger-Scanning: High, High, Medium, Low, High (***); Traveler Clearance, Drivers License, Welfare; 1:1: Yes, 1:N: Yes
• Hand Geometry: High, High, Medium, High, Medium, High (***); Access Control, Traveler Clearance, Day Care; 1:1: Yes, 1:N: No
• Facial Recognition: High[i], Medium, High, High, Medium, Low (***); Casino, Traveler Clearance; 1:1: Yes, 1:N: Yes[ii]
• Iris Scanning: Very High, Medium, Low, Medium, High, High (*****); Prisons, Access Control, Traveler Clearance; 1:1: Yes, 1:N: Yes
• Retinal Scanning: Very High, Low, Low, High (****); Access Control, Traveler Clearance; 1:1: Yes, 1:N: Yes
• Finger Geometry: Medium, High, Medium, High, Medium, High (***); Access Control, Amusement Park Ticket holder; 1:1: Yes, 1:N: No
• Voice Recognition: Medium, High, High, Medium, Low (*); Low security applications, telephone authentication; 1:1: Yes, 1:N: No
• Signature Verification: Medium, High, Medium, High, Medium, Low (**); Low security applications, applications with existing ‘signature’; 1:1: Yes, 1:N: No

Chart by Peter Hope-Tindall – developed for the OECD
[i] Note: Although the ‘potential’ exists for high accuracy, recent pilot projects have indicated great difficulty in obtaining accurate results with 1:N systems.
[ii] Ibid.
How does a biometric system work?
• Scanning / Collection of Sample
• Feature Extraction
• Biometric template creation
• Biometric template matching
• Many vendors have proprietary
searching subsystems and optimized
hardware
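The stages above can be walked through end to end in miniature. The fake signal, summary-statistic "features" and rounding-based template below are stand-ins for the proprietary extraction and matching subsystems real vendors use:

```python
# Toy walk-through of the four stages: collection, feature extraction,
# template creation, and template matching. All data is fabricated.
import statistics

def collect_sample():
    """Stage 1: scanning / collection -- here, a fake 1000-point signal."""
    return [((i * 37) % 101) / 100 for i in range(1000)]

def extract_features(sample):
    """Stage 2: feature extraction -- reduce raw data to a few numbers."""
    return (min(sample), max(sample), statistics.mean(sample))

def make_template(features):
    """Stage 3: template creation -- quantize features into a compact record."""
    return tuple(round(f, 2) for f in features)

def match(template_a, template_b, tolerance=0.05):
    """Stage 4: matching -- templates, never raw samples, are compared."""
    return all(abs(a - b) <= tolerance for a, b in zip(template_a, template_b))

sample = collect_sample()
template = make_template(extract_features(sample))
print(len(sample), "->", len(template))   # the template is far smaller than the sample
print(match(template, template))          # True
```

The size drop from sample to template mirrors the earlier point that a template holds perhaps 1/100th to 1/1000th of the sample's data.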
Types of Function
• Identification (1:N)
  • Submission of sample as a search candidate against entire database
• Verification (1:1)
  • Validation of sample against a presumed identity
Standard Biometric System
Sensor → Logic → Reference Database → Application
Enrollment (Data Subject):
1. Biometric Device
2. Capture and Feature Extraction
3. Create Reference Template or dataset
4. Store Template in Database

Biometric Verification:
5. Biometric Device
6. Capture and Feature Extraction
7. Create Candidate Match Template or dataset
8–9. Business Application
Metrics
• Scientific Method / Biometric Testing
“The real purpose of the scientific method is to
make sure Nature hasn't misled you into
thinking you know something you don't
actually know.”
Robert M. Pirsig, Zen and the Art of
Motorcycle Maintenance
Perceptions
• Public perceptions
  • Looking for a magic solution
  • ‘Feel safe’ technology
• Post-terrorism opportunism
• Limited information
Biometric Performance
• “False Reject Rate” a.k.a. False Non-Match Rate (FNMR)
• “False Acceptance Rate” a.k.a. False Match Rate (FMR)
• “Equal Error Rate”
• Biometric System Error Trade-off
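The trade-off between the two error rates can be demonstrated by sweeping the decision threshold over made-up score distributions (the numbers below are illustrative only, not measured data):

```python
# Hypothetical match scores: higher score = stronger match. Raising the
# threshold lowers FMR but raises FNMR; the Equal Error Rate (EER) is
# the point where the two curves cross.
genuine  = [0.91, 0.85, 0.78, 0.88, 0.70, 0.95, 0.82, 0.60]  # same-person scores
impostor = [0.30, 0.42, 0.55, 0.20, 0.65, 0.35, 0.48, 0.25]  # different-person scores

def rates(threshold):
    fnmr = sum(s < threshold for s in genuine) / len(genuine)     # false non-match rate
    fmr = sum(s >= threshold for s in impostor) / len(impostor)   # false match rate
    return fmr, fnmr

# Find the threshold where |FMR - FNMR| is smallest: the EER crossover.
best_t = min((t / 100 for t in range(101)),
             key=lambda t: abs(rates(t)[0] - rates(t)[1]))
fmr, fnmr = rates(best_t)
print(f"threshold={best_t:.2f}  FMR={fmr:.3f}  FNMR={fnmr:.3f}")
```

A real evaluation plots these rates over thousands of comparison scores; the sweep itself is exactly this loop over candidate thresholds.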
Equal error rate crossover
(Chart: false accept (FA) and false reject (FR) error rates plotted against sensitivity; the curves cross at the equal error rate.)
Other Metrics
• “Failure to Acquire”
  • Missing fingers/eyes
• “Failure to Enroll”
  • Insufficient features
  • May be as high as 2–4% in the general population (up to 20–30% in the elderly).
• Throughput
• System Cost
Publicly Available Independent Evaluations
• CESG
• http://www.cesg.gov.uk/site/ast/index.cfm?menuSelect
ed=4&displayPage=4
• Face Recognition Vendor Test
• http://www.frvt.org
• Fingerprint Verification Competition
• http://bias.csr.unibo.it/fvc2002
• US National Biometric Test Center
• http://www.engr.sjsu.edu/biometrics/nbtccw.pdf
Security Concerns related to Biometrics
• Spoofing
• Countermeasures
• Replay Attacks
• Cannot revoke a biometric
• Improper Reliance
• Insufficient Enrolment Rigour
Liveness
• Steve McCurry, photographer of ‘Afghan Girl’ portrait for National Geographic – 1984.
• National Geographic: http://www.melia.com/ngm/0204/feature0/
Concerns about Biometric systems
• Rigour of enrollment process
• Lack of independent performance metrics
• No very-large population biometric system examples
• Failure-to-enroll and failure-to-acquire underclass (maybe as high as 2–4%, to even 20–30%)
• Post-terrorism opportunism
• Technology panacea
• Large scale biometric system failure
Privacy Concerns
1. Function Creep
2. Infrastructure of Surveillance/Unique Identifier
   • Default method of identification
   • Used inappropriately
3. Consent/Transparency
   • Information Leakage
     • Glaucoma
     • DNA Profiling
Function Creep/Finality
• ‘Function Creep’ (also known as ‘purpose creep’) is the
term used to describe the expansion of a process or
system, where data collected for one specific purpose is
subsequently used for another unintended or unauthorized
purpose.
• In fair information practice terms, we may think of function creep as the subsequent use, retention or disclosure of data without the consent of the individual, and as unauthorized changes in the purpose specification for a given data collection.
Function Creep/Finality Example
• As an example, we may think of a social service (welfare)
system that requires a finger scan to enroll. Let us assume
that undertakings were made at enrollment to the user that
the finger scan is being collected solely for the purposes of
guarding against ‘double dipping’ (ensuring that the user is
not already registered for welfare). If the finger scan were
subsequently used for another purpose (e.g. a law
enforcement purpose, something not described in the initial
purpose specification) then we have ‘function creep’.
Infrastructure of Surveillance/Unique identifier
• An overarching concern for some people is that
biometrics will become a technology of surveillance and
social control. Perhaps as the ultimate personal identifier,
they may be seen to facilitate all the ominous and
dehumanizing aspects of an information society -- a
society in which unparalleled amounts of personal
information may be collected and used on a systematic
basis.
see O’Connor, “Collected, Tagged, and Archived.”
Consent/Transparency
• Certain biometrics may be used without the consent or active
participation (or indeed even the knowledge) of the individual.
• Iris scanning can already be performed at a substantial distance (a
range of 18 to 24 inches)[i] from the subject. As the technology
improves, it is quite likely that iris acquisition may take place from
even greater distances and without any user involvement
whatsoever.
• From a privacy perspective these situations can conflict with the
collection limitation, openness and purpose specification principles.
[i] http://www.eweek.com/article2/0,3959,115743,00.asp
Implementation Modalities to Protect Privacy
• Statutory
• Policy
  • Privacy Impact Assessment
  • Threat Risk Assessment
  • Common Criteria Scheme
  • Standards
• Technology
  • Tamper-proof hardware
Statutory
• In some jurisdictions, generalized or specific criminal sanction may
be used to provide security protection for biometric systems and to
outlaw certain activities to bypass security controls.
• Ontario Works Act
http://www.e-laws.gov.on.ca/DBLaws/Statutes/English/97o25a_e.htm
• Biometric Identifier Privacy Act – State of New Jersey
http://www.njleg.state.nj.us/2002/Bills/A2500/2448_I1.HTM
Statutory
• Statutory proscription and prohibition
• Problem: may always be modified or interpreted by the Government of the day.
• Example: Statistics Canada 1906-1911 Census
Policy
• The Privacy Impact Assessment (PIA) and privacy audits can ensure that privacy policies are followed and that those policies meet the needs of a given level of privacy protection or compliance. Although these techniques are commonplace within government, they are just starting to appear in the private sector.
• Depends on the rigour and independence of the PIA process.
Technology
• STEPS – Security Technology Enabling Privacy
  • Build security systems that are privacy enabled
  • Meet both Security and Privacy requirements
• Privacy Architecture
  • De-Identification
  • De-Linkability
  • De-Observability
  • Divide and conquer (similar to SIGINT)
Standard Biometric System
Sensor → Logic → Database → Application
Standard Biometric System
• 1:1 and 1:N Functionality
• Maximizes Control for System Owner
• No Cards to lose
• Back End Database Model
• Potential for Surveillance
• Greatest Potential for abuse
Smart Card Biometric System
Sensor → Logic → Database → Application
Smart Card Biometric System
• 1:1 Functionality
• Balance of Control between System Owner
and Data Subject
• Lost Card Issues
• Card Failure Issues
• Smart Card Infrastructure has Surveillance
Potential
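The balance of control in this model can be sketched in code, assuming a hypothetical card object that holds the reference template and answers only yes/no; the class, the matching rule and the threshold are illustrative simplifications:

```python
# Sketch of the smart-card model: the reference template lives on the
# card the data subject carries, and only a 1:1 comparison happens at
# the reader -- there is no central template database to search.
import math

class SmartCard:
    def __init__(self, holder_name, template):
        self.holder_name = holder_name
        self._template = template      # stays in the data subject's custody

    def match_on_card(self, candidate, threshold=0.1):
        """The card answers yes/no; the reader never sees the template."""
        return math.dist(self._template, candidate) <= threshold

card = SmartCard("J. Doe", (0.4, 0.7, 0.2))
print(card.match_on_card((0.41, 0.69, 0.21)))   # True: live sample close enough
print(card.match_on_card((0.90, 0.10, 0.80)))   # False: different finger
```

Because matching is confined to the card, losing or damaging the card blocks the legitimate holder (the lost-card and card-failure issues above), but it also denies the system owner a searchable central store.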
Privacy Hardened Biometric System
• Designed for Toronto CIBS Project - 1997
• Tamperproof tokens in scanner to prevent device
substitution/direct image injection
• FPGA logic to restrict use of system within preprogrammed
guidelines
• Must be a live finger on the authorized scanner
• Discourage systematic ‘dumping’ of identity database
• Keys required for identity resolution
Privacy Hardened 1:N System
Sensor → Logic → Database → Identity Resolver
Privacy Hardened
The Identity Resolver maps pseudo-identities to real identities:

Pseudo Identity       Real Identity
17943568957845        Mr John Smith
73458734857384        637-759-986
53798475839753        August 31st, 1953
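The pseudo-identity idea can be sketched with a keyed mapping where resolution fails without a separately held key. The HMAC construction, key name and table layout below are illustrative assumptions, not the actual CIBS design:

```python
# Sketch of the identity-resolver idea: the biometric database holds
# only opaque pseudo-identities; resolving one to a real identity
# requires a key held outside the biometric system.
import hmac
import hashlib

RESOLVER_KEY = b"held-by-oversight-body"      # hypothetical key custodian

def pseudo_identity(real_identity):
    """Derive the opaque identifier stored alongside templates."""
    return hmac.new(RESOLVER_KEY, real_identity.encode(),
                    hashlib.sha256).hexdigest()[:14]

# The resolver's table is built (and usable) only by the key holder.
REAL_IDENTITIES = ["Mr John Smith", "637-759-986"]
RESOLVER_TABLE = {pseudo_identity(r): r for r in REAL_IDENTITIES}

def resolve(pseudo, key):
    """Identity resolution fails without the correct key."""
    if not hmac.compare_digest(key, RESOLVER_KEY):
        raise PermissionError("identity resolution requires the resolver key")
    return RESOLVER_TABLE[pseudo]

p = pseudo_identity("Mr John Smith")
print(resolve(p, b"held-by-oversight-body"))  # Mr John Smith
```

Splitting the matching database from the resolver key is the "divide and conquer" move mentioned earlier: dumping the database alone yields only pseudo-identities.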
Wildcard Option - Biometric Encryption
• Biometric Encryption
• Feature Extraction provides an
encryption/decryption key
• Promising techniques
• Optical Feature Extraction
• Fourier Transform of ‘visual’ plaintext
using Biometric Feature data resulting in
‘visual’ ciphertext
• One Installed site in Canada
• Needs further research to bring to market
Biometric Encryption
Enrollment: the fingerprint pattern encrypts the PIN (73981946 → %h*9%4Kd); the encrypted PIN (%h*9%4Kd) is stored.
Authentication: the fingerprint pattern decrypts the stored value (%h*9%4Kd → 73981946); the PIN is used for access.
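This enrollment/authentication flow can be mimicked in a deliberately naive sketch. Real biometric encryption must tolerate sample-to-sample variation (e.g. via the optical Fourier-transform techniques mentioned above), whereas this toy assumes an identical template at both steps, and every value in it is fabricated:

```python
# Naive illustration of the flow: a key derived from the (toy)
# fingerprint template encrypts the PIN at enrollment, and the same
# template releases it again at authentication.
import hashlib

def keystream(template, length):
    """Derive pseudo-random bytes from the toy template."""
    return hashlib.sha256(template).digest()[:length]

def xor(data, key):
    return bytes(a ^ b for a, b in zip(data, key))

template = b"toy-fingerprint-pattern"
pin = b"73981946"

# Enrollment: only the encrypted PIN is stored.
encrypted_pin = xor(pin, keystream(template, len(pin)))

# Authentication: the same template recovers the PIN for access.
recovered_pin = xor(encrypted_pin, keystream(template, len(pin)))
print(recovered_pin.decode())   # 73981946
```

The privacy appeal is that the stored record is useless without a live presentation of the biometric; the open research problem is deriving a stable key from inherently noisy samples.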
from: http://www.darpa.mil/iao/HID.htm
Recommendations
• Communicate openly and honestly about any planned system.
• Smaller, inward-looking systems.
• Focus on 1:1 authentication systems instead of 1:N identification systems.
• Whenever possible, develop opt-in voluntary enrollment systems.
Recommendations
• Collect biometric samples openly and with the consent of the user.
• If possible, allow the user to retain custody of the biometric template (perhaps on a smart card or token) and do not store the biometric template in a central system.
• Where 1:N systems are required, craft protections in legislation/policy and technology.
• Oversight – restrictions on system usage/identity resolution.
Hope for the future
• Biometric Encryption
• Biometric sensor on a card
• Credential vs. Certificate (Brands)
• Trusted Extension of Self
Conclusions
• We need to incorporate statutory, policy and technological controls.
• Engage the issues honestly and openly.
• Don’t use a hammer to kill a fly.
Our Challenge
• Open discussion
• The technology is not evil
• Develop the best Technology
• Develop the best Policy
• Develop the best Statutory Protections
• Raise the bar
• Search for improvement
Epilogue
• Objectivity of privacy
  • Question the appropriateness of ‘Public Acceptance’ as a measurement of anything.
  • Useful at telling us what is wrong.
  • Not so useful at telling us what is right.
• Storing Template/Minutiae/Image
  • Privacy concerns are identical
Resources
• OECD: http://www.oecd.org
• Information and Privacy Commission/Ontario: http://www.ipc.on.ca
• dataPrivacy Partners Ltd.: http://www.dataprivacy.com
• Roger Clarke: http://www.anu.edu.au/people/Roger.Clarke/
• Biometric Consortium (US): http://www.biometrics.org
• CATA Biometrics Group (Canada): http://www.cata.ca/biometrics/
Contact Information
Peter Hope-Tindall
dataPrivacy Partners Ltd.
5744 Prairie Circle.
Mississauga, ON L5N 6B5
+1 (416) 410-0240
pht@dataprivacy.com
pht@dataprivacy.com
http://www.dataprivacy.com