Security in Ad-Hoc Wireless Networks of Embedded Devices
Ehud Meiri
Embedded Computing Seminar
2005/6
Talk Outline
Introduction
Security Basics
Security in Ad-hoc Wireless Networks
Miscellaneous
Introduction
The Embedded Environment
Historical Perspective - Why Do We Need Security?
The Embedded Environment
► Many devices that communicate with one another in a network
  ▪ Connections can be peer-to-peer or broadcast
  ▪ Through wires, RF, lasers, etc.
► These devices may have
  ▪ limited battery power
  ▪ limited computational power
Brief History Example
► Cellphones - Analog
  ▪ Two-Way Radios
    ► Authenticated via live operator
    ► No privacy
    ► Few attacks
  ▪ First cellphones
    ► Still no privacy
    ► MIN/ESN pairs for authentication
    ► No need for a live operator to connect
    ► Widespread cloning attacks (roaming)
http://www.cra.org/Activities/fellows/wagner.pdf
http://en.wikipedia.org/wiki/History_of_mobile_phones
Brief History Example (2)
► Cellphones – Digital
  ▪ GSM
    ► Good authentication (shared secret)
    ► Bad cryptography, easy to break – no privacy
  ▪ Who's to blame?
  ▪ Viruses!
    ► Homogeneous digital environments
    ► Symbian Bluetooth viruses
http://en.wikipedia.org/wiki/Global_System_for_Mobile_Communications
http://wired-vig.wired.com/news/technology/0,1282,11630,00.html
http://www.eweek.com/article2/0,1759,1733176,00.asp
Conclusions
► A wireless network means wireless attacks
  ▪ New challenges
  ▪ Usually impossible to detect eavesdropping
  ▪ Hard to locate attackers
► We can classify two network mediums:
  ▪ Broadcast – Anyone can listen
  ▪ Private – Eavesdroppers require more effort to listen than the intended audience
  ▪ Solutions turn broadcast into private or leverage the broadcast nature for attack detection
Conclusions (2)
► Where would we want to enable security?
  ▪ In public embedded environments
    ► Cellphones
    ► Campuses
    ► Museums
  ▪ Wireless networks
    ► Wi-Fi SOHO networks
  ▪ Sometimes it's a wasted effort
    ► TV remote control
Security Basics
Security Criteria
Encryption
Authentication
Security Pragmatism
► Q: How do you keep your embedded device from being messed with?
  ▪ A: Turn it off.
► Sometimes the best we can hope for is to detect intrusions.
Security Criteria
► Three main security concerns:
  ▪ Confidentiality
    ► Data privacy
  ▪ Availability
    ► Resistance to DoS attacks
  ▪ Authenticity
    ► Keeping "foreign objects" out, data integrity
Encryption
► A basic building block of security
► Public vs. Symmetric key cryptography
► Embedded devices have power constraints
  ▪ Asymmetric keys are 10³-10⁴ times slower
  ▪ Use symmetric keys (AES, IDEA)
    ► Can use public key cryptography to set up the secret key (see the sketch below)
  ▪ Key exchange – more on that later
  ▪ Use efficient hardware implementations
http://en.wikipedia.org/wiki/AES
http://en.wikipedia.org/wiki/Rsa
http://en.wikipedia.org/wiki/IDEA_(cipher)
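To make the "symmetric keys for bulk data, public key only for key setup" point concrete, here is a minimal Python sketch of that hybrid pattern, assuming the pyca/cryptography package is available. The specific primitives (X25519 key agreement, HKDF, AES-GCM) are illustrative choices, not ones prescribed by the slides:

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each side performs one expensive public-key operation (key agreement)...
device = X25519PrivateKey.generate()
controller = X25519PrivateKey.generate()
shared = device.exchange(controller.public_key())

# ...and derives a cheap symmetric AES-128 key for all further traffic.
key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
           info=b"session key").derive(shared)

aead = AESGCM(key)
nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, b"sensor reading: 21.5 C", None)
assert aead.decrypt(nonce, ciphertext, None) == b"sensor reading: 21.5 C"

On a constrained node, only the one-time key agreement pays the public-key cost; every packet after that uses the symmetric cipher.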
Advanced Encryption Standard (AES)
► The Rijndael block cipher was selected by NIST in 2000 to be the AES
  ▪ Replacement for DES
  ▪ Key length of 128, 192, or 256 bits; block size is 128 bits
http://www.iaik.tu-graz.ac.at/research/krypto/AES/ - list of articles
http://www.quadibloc.com/crypto/co040401.htm
http://www.iaik.tugraz.at/research/publications/2005/IEEIFSTINA2005.htm
Small Hardware AES-128 Implementations
► 5.4 kgates implementation (Satoh et al., 2001)
► AES Implementation on a Grain of Sand (Feldhofer et al., 2005)
  ▪ 3.4 kgates equivalent
  ▪ 0.25 mm²
  ▪ 9 Mbps
  ▪ "draws only a current of 3.0 µA when operated at 100 kHz and 1.5 V"
http://www.iaik.tugraz.at/research/publications/2005/IEEIFSTINA2005.htm
Fast Software Implementations
► AES-128 (see the rough throughput arithmetic below)
  ▪ 226 cycles/block on a P-III (Aoki & Lipmaa, 2002)
  ▪ 14464 P-III cycles for 1 KB
► FastIDEA (4-way IDEA) (Lipmaa)
  ▪ 440 cycles for a 4x64 block using MMX
► Poly1305-AES message authentication (Bernstein)
  ▪ 3.1n + 780 Athlon cycles for an n-byte message
  ▪ 5361 P-III cycles for 1 KB
http://www.cs.ut.ee/~lipmaa/aes/rijndael.html
http://cr.yp.to/mac/poly1305-20050329.pdf
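As a rough sanity check on the cycle counts quoted above, a few lines of arithmetic turn them into throughput figures. The 1 GHz clock is an assumption for illustration, not a number taken from the cited papers:

def mb_per_sec(cycles, payload_bytes, clock_hz=1e9):   # clock speed assumed
    return clock_hz / cycles * payload_bytes / 1e6

# AES-128 at 226 cycles per 16-byte block
print(round(mb_per_sec(226, 16)))                 # ~71 MB/s

# Poly1305-AES at 3.1*n + 780 cycles for an n-byte message
n = 1024
print(round(mb_per_sec(3.1 * n + 780, n)))        # ~259 MB/s for 1 KB messages

Even software implementations are far faster than the radios on typical embedded nodes; the constraint is energy and silicon, not raw speed.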
Embedded Encryption
► Put the encryption in the network device
► Wired (100Base-TX) and wireless (802.11b) versions
  ▪ Supports WPA, WEP
  ▪ Does 256-bit AES
  ▪ Not hardware encryption
  ▪ 820-1280 mW
http://www.lantronix.com/device-networking/embedded-device-servers/wiport.html
http://www.lantronix.com/device-networking/embedded-device-servers/xport.html
Embedded Encryption (2)
► Put the encryption in the CPU
  ▪ VIA chips now offer a built-in security engine
    ► 256-bit AES
    ► Quantum-based random number generator
    ► Montgomery Multiplier for accelerating Public Key Cryptography
  ▪ Example: Eden-N Processor (smallest)
    ► Thermal Design Power: 2.5 W @ 533 MHz
    ► Size: 15x15 mm
http://www.via.com.tw/en/initiatives/padlock/hardware.jsp
http://www.via.com.tw/en/products/processors/eden-n/
http://en.wikipedia.org/wiki/Thermal_Design_Point, http://en.wikipedia.org/wiki/Montgomery_reduction
http://citeseer.ist.psu.edu/ravi02system.html
Authentication Woes
► Central Authentication Mechanisms?
  ▪ Ad-hoc wireless networks aren't permanent
    ► Not always reachable
    ► Congestion around central authorities
    ► DoS
  ▪ Expensive to make rapid changes
    ► Nodes may only connect periodically
► How do we know we're talking to who we think we're talking to?
The Resurrecting Duckling
► Scenario: embedded device + controller
► We need to prevent unauthorized use
  ▪ Authenticity
► The controller is imprinted on the device
  ▪ Like a duckling, the first controller encountered is the controller for life.
  ▪ A secret key for symmetric key cryptography
http://citeseer.ist.psu.edu/stajano99resurrecting.html
The Resurrecting Duckling (2)
► Passing control
  ▪ Kill the duckling and resurrect it (reset the device)
  ▪ Imprint a new controller onto it
► Imprinting wirelessly
  ▪ Man-in-the-middle attack
  ▪ Solution: imprint through a physical connection (see the sketch below)
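A minimal sketch of the imprinting policy described on the last two slides; the class and method names are invented for illustration and do not come from the paper:

import hmac

class Duckling:
    """Device that bonds to the first controller it meets and obeys only it."""

    def __init__(self):
        self.mother_key = None                    # imprintable ("dead") state

    def imprint(self, key, physical_contact):
        # Only the first controller may imprint, and only over a physical
        # channel, to rule out a wireless man-in-the-middle during imprinting.
        if self.mother_key is not None or not physical_contact:
            return False
        self.mother_key = key
        return True

    def obey(self, key, command):
        # Commands are accepted only from the imprinted "mother duck".
        if self.mother_key is None or not hmac.compare_digest(key, self.mother_key):
            raise PermissionError("not the imprinted controller")
        return command()

    def kill(self, key):
        # "Death" wipes the bond; the resurrected device can be re-imprinted.
        if self.mother_key is not None and hmac.compare_digest(key, self.mother_key):
            self.mother_key = None

Passing control is then: kill() issued by the current controller, followed by a fresh imprint() from the new one over a physical connection.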
The Resurrecting Duckling (3)
► Example technology: Bluetooth
  ▪ Device pairing
    ► By MAC address
    ► Done by the user
  ▪ Discovery broadcasts
    ► An attack vector for viruses
    ► Solution: disable responses and only talk to paired devices (see the sketch below)
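The "pair, then go silent" advice boils down to a whitelist check. This is a generic sketch, not real Bluetooth stack code, and the address is made up:

paired = {"00:11:22:33:44:55"}      # MAC addresses the user explicitly paired
discoverable = False                # stay invisible to discovery scans

def on_inquiry(src_mac):
    # Ignore discovery broadcasts entirely while not discoverable.
    return discoverable

def on_connect(src_mac):
    # Talk only to devices the user has paired.
    return src_mac in paired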
Ad-Hoc Wireless Networking
Intro (AODV)
Coping with attacks at the network level: peer-to-peer style, in the protocol, with trust
Physical & Application levels
Ad-Hoc Wireless Networking
► Network is created on-the-fly
► Routes messages through intermediate nodes
► Vulnerable to numerous attacks
  ▪ Physical layer: eavesdropping, jamming
  ▪ Network layer: attacker is a peer, a router
Ad-hoc On-demand Distance Vector routing protocol (AODV)
► On-demand path discovery (see the toy sketch below)
  ▪ Using broadcasts
► Protocol builds a route using a distributed Bellman-Ford algorithm (distance vector)
  ▪ Slow to find shortest paths
► Old routes slowly expire from the cache
http://en.wikipedia.org/wiki/AODV
http://moment.cs.ucsb.edu/AODV/aodv.html
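A toy model of AODV's broadcast route discovery, just to make the mechanism concrete; the field and method names are simplified inventions, not the protocol's actual message format:

class Node:
    def __init__(self, name):
        self.name = name
        self.seen = set()       # (originator, request id) floods already handled
        self.routes = {}        # destination -> (next hop, hop count)

    def handle_rreq(self, originator, req_id, destination, hop_count, prev_hop):
        if (originator, req_id) in self.seen:
            return "drop"                        # duplicate copy of the flood
        self.seen.add((originator, req_id))
        # Learn the reverse route back toward the originator.
        self.routes[originator] = (prev_hop, hop_count + 1)
        if self.name == destination:
            return "send RREP back along the reverse route"
        return "rebroadcast with hop_count + 1"  # keep flooding outward

A wormhole attacker (next slides) subverts exactly this logic: tunneled RREQs arrive first with a small hop count, so victims install routes that pass through the attacker.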
AODV Vulnerabilities
► Attacker is a peer in the network layer
  ▪ Routing updates misbehavior
    ► Preventing routes from being built or being built efficiently
    ► Invalidating routes
  ▪ Packet forwarding misbehavior
    ► Dropping packets
  ▪ Availability
Self-Organized Network Layer Security (Yang, Meng, Lu, UCLA '02)
► Collective monitoring of peers
► A node is given a token from its neighbors
  ▪ Tokens expire after a while
    ► Token duration increases with each renewal
  ▪ Tokens are signed by peers (PK/SK pair for the system)
  ▪ Polynomial secret sharing scheme (polynomial of order k-1) – see the sketch below
    ► Each node only has part of the secret key
http://citeseer.ist.psu.edu/yang02selforganized.html
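The "each node only has part of the secret key" idea rests on polynomial (Shamir-style) secret sharing. Below is a minimal (k, n) sketch with toy parameters; the paper builds threshold signing on top of this, which the sketch does not attempt:

import random

P = 2**127 - 1    # a prime modulus (toy choice for illustration)

def make_shares(secret, k, n):
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    value = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, value(x)) for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0: any k shares reconstruct the secret.
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3, n=7)
assert recover(shares[:3]) == 123456789    # any 3 of the 7 neighbors suffice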
Self-Organized Network Layer Security (2)
► Tokens are revoked for misbehaving
  ▪ Blackmail attack
  ▪ "m out of N" strategy for cross-validation of claims
    ► Increasing m decreases the chances of both detection and false detection
► Complexity of implementation: unknown, but regular PK is considered expensive
Packet Leashes (Hu, Perrig, Johnson, CMU/Rice)
► Wormhole attack: forward packets to remote locations (more than 1 hop)
  ▪ Availability
► "Wormholed" packets arrive sooner
  ▪ In AODV, two nodes may think they are near each other
► No need to understand the protocol to attack
http://citeseer.ist.psu.edu/hu01packet.html
Packet Leashes (2)
► Geographical Leashes
  ▪ Nodes know:
    ► Their location
    ► Loosely synchronized clocks
    ► Global upper bound on node velocity
  ▪ Packets include location and timestamps
    ► Digitally signed
      ▪ Via a trusted entity that signs PKs
      ▪ Via other methods referenced in the article
  ▪ Compute the distance bound (see the sketch below)
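A sketch of the receiver-side check for a geographical leash, following the idea above rather than the paper's exact formula; all the constants are assumed example values:

import math

V_MAX = 20.0          # global upper bound on node speed, m/s (assumed)
CLOCK_ERROR = 1e-3    # loose clock synchronization error, s (assumed)
POS_ERROR = 10.0      # positioning error, m (assumed)
RADIO_RANGE = 250.0   # maximum legitimate transmission distance, m (assumed)

def accept(sender_pos, my_pos, t_sent, t_received):
    claimed = math.dist(sender_pos, my_pos)
    # Worst-case distance to the real sender: the claimed separation plus how
    # far both nodes could have moved during the (clock-uncertain) transit time.
    bound = claimed + 2 * V_MAX * (t_received - t_sent + CLOCK_ERROR) + POS_ERROR
    return bound <= RADIO_RANGE   # otherwise the packet may have been wormholed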
Packet Leashes (3)
► Temporal Leashes
  ▪ Requires tightly synchronized clocks
    ► Up to a few µs or even 100's of ns
    ► For example using GPS
  ▪ Packets contain a time signature
    ► Also digitally signed
  ▪ Receiver can check if a packet has traveled too far (see the sketch below)
    ► Based on the speed of light and an agreed maximum transmission distance
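The temporal leash check is even simpler: with tightly synchronized clocks, a packet that arrives later than the agreed maximum distance divided by the speed of light is treated as having traveled too far. A sketch with assumed numbers:

C = 299_792_458.0        # speed of light, m/s
MAX_DISTANCE = 300.0     # agreed maximum transmission distance, m (assumed)
CLOCK_ERROR = 200e-9     # tight clock synchronization error, s (assumed)

def accept(t_sent, t_received):
    # A legitimate one-hop packet must arrive within MAX_DISTANCE / C,
    # plus the allowance for clock error; later arrivals are rejected.
    return (t_received - t_sent) <= MAX_DISTANCE / C + CLOCK_ERROR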
Proxy-Based Protocols (Burnside, Clarke, Mills, Devadas, Rivest)
► Every device has a trusted proxy
  ▪ Impoverished devices – external proxies
  ▪ Powerful devices – internal proxies
► Proxy duties
  ▪ Enabling inter-device communication
  ▪ Access control
  ▪ Protocol translation between devices
http://citeseer.ist.psu.edu/burnside02proxybased.html
Proxy-Based Protocol (2)
► Proxies use the SPKI/SDSI public key infrastructure for ACLs
  ▪ No hierarchy of trust
  ▪ Must provide a certificate chain to prove authorization (see the sketch below)
    ► For example, if access is allowed only to members of group B, a valid certificate chain may be:
      ▪ here's a certificate that states I'm a member of group A, and a certificate that states that every member of A is also a member of B
http://citeseer.ist.psu.edu/clarke01certificate.html
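A toy version of the certificate-chain check in the SPKI/SDSI example above: the requester presents a chain of membership certificates linking its key to the group named on the ACL. Signature verification is deliberately elided, the structure is what matters here, and the names are made up:

from dataclasses import dataclass

@dataclass
class MembershipCert:
    subject: str     # a key or a group
    member_of: str   # group the subject is certified to belong to

def chain_authorizes(requester_key, chain, acl_group):
    current = requester_key
    for cert in chain:               # each certificate must extend the previous link
        if cert.subject != current:
            return False
        current = cert.member_of
    return current == acl_group

# "I'm a member of group A, and every member of A is also a member of B."
chain = [MembershipCert("requester-key", "group A"),
         MembershipCert("group A", "group B")]
assert chain_authorizes("requester-key", chain, "group B")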
Jamming/Interference
► An attacker may jam our network with a lot of packets or interfere with the signal.
  ▪ Availability
► Coping with jamming/interference attacks
  ▪ Locate the attacker by measuring LAN signal strength
    ► This can also be used against us (Confidentiality)
  ▪ Attacker is generating a lot of requests – prioritize service
  ▪ Attacker is generating noise at the physical level – Spread Spectrum technology
http://citeseer.csail.mit.edu/537210.html - Only a starting point…
Sensor Networks
► Attacker can contribute faulty data
  ▪ Authenticity, Reliability
    ► In this context, the attacker is a "Byzantine" node
► Solution: distributed consensus protocols (see the simple illustration below)
  ▪ Classic asynchronous problem impossible (FLP, 1985)
  ▪ Possible with digital signatures
http://theory.lcs.mit.edu/tds/papers/Lynch/jacm85.pdf
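This is not a Byzantine consensus protocol (those need rounds of signed messages), but a two-line illustration of why redundancy helps at all: with enough honest reporters, a robust aggregate such as the median is not dragged away by a minority of faulty values:

import statistics

# Five nodes report a temperature; one ("n4") is faulty or malicious.
readings = {"n1": 21.4, "n2": 21.6, "n3": 21.5, "n4": 95.0, "n5": 21.5}
print(statistics.median(readings.values()))    # 21.5 - the outlier does not win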
Miscellaneous
Sleep Deprivation Torture
Security Bugs
Battery Exhaustion
► "Sleep Deprivation Torture" - DoS
  ▪ Availability
► Keep a battery-powered device busy so that its battery runs out
► Solution: standard DoS coping strategies (see the sketch below)
  ▪ Throttle services
  ▪ Flood protection
  ▪ Alert the supervisor
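"Throttle services" can be as simple as a token bucket in front of every request handler; a small sketch, with arbitrary example values for rate and burst:

import time

class Throttle:
    def __init__(self, rate_per_sec=2.0, burst=10):
        self.rate, self.burst = rate_per_sec, burst
        self.tokens, self.last = float(burst), time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the time elapsed, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False    # drop the request, and perhaps alert a supervisor

throttle = Throttle()
if throttle.allow():
    pass  # serve the request; rejected requests cost almost no energy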
Buggy Software
► Software bugs may trigger an attack
  ▪ Authenticity, Confidentiality, and Availability
► Solutions
  ▪ Standard preventive programming measures
    ► Unit tests
  ▪ Other solutions proposed here (to cope with attacks)