Privacy in the intelligent ambience and network complexity
Miltos Anagnostou
School of Electrical and Computer Engineering
National Technical University of Athens
The objective of this talk
• The need for privacy protection is likely to
drastically shape future networks and
services and to increase their complexity.
• Otherwise, the general population will
most probably not accept ambient
intelligence in everyday life.
Basic Privacy: The lamp example (I)
• Assumptions: One house, one
room, one lamp, one human.
• What can possibly be the meaning
of switching a lamp on and off?
(Trivial answer: any digital signal.)
• Assume usual meaning: “On”
means presence, “off” means
absence (or sleep).
• What is privacy in this context? An
external observer should not be
able to detect human
presence/absence.
• Light can be seen only by nearby
observers.
The lamp example (II)
• Trivial solution: Close window or remove lamp
(effective, but degrades “service”).
• “Sophisticated” solution: An automatic
mechanism produces a plausible on/off pattern
in the human’s absence.
• Intruder’s attack 1: Analyse the on/off pattern
and test whether it is realistic (a toy version of
both the decoy and this test is sketched below).
• Intruder’s attack 2: Combine information from
different sources: “Is the car parked outside?”
• Countermeasure to attack 2: Have second car
(at a cost).
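A minimal sketch of the decoy mechanism and of attack 1, assuming a toy hourly occupancy model; all probabilities, names, and thresholds below are invented for illustration:

```python
import random

# Hypothetical occupancy model: probability that the lamp is ON at each
# hour of the day when a human is actually present (invented numbers).
PRESENCE_ON_PROB = {h: 0.9 if 19 <= h <= 23 else 0.05 for h in range(24)}

def human_pattern(days, seed=1):
    """Hourly on/off trace produced by a (simulated) real occupant."""
    rng = random.Random(seed)
    return [int(rng.random() < PRESENCE_ON_PROB[h % 24]) for h in range(24 * days)]

def naive_decoy(days, seed=2):
    """Attackable decoy: switches at random, ignoring the time of day."""
    rng = random.Random(seed)
    return [int(rng.random() < 0.3) for _ in range(24 * days)]

def hourly_profile(trace):
    """Fraction of ON time at each hour of the day (the intruder's statistic)."""
    days = len(trace) / 24
    prof = [0.0] * 24
    for i, on in enumerate(trace):
        prof[i % 24] += on / days
    return prof

def attack_1(reference, observed):
    """Attack 1: distance between the observed and a known-human hourly
    profile; a large value suggests the observed pattern is not realistic."""
    return sum(abs(a - b) for a, b in
               zip(hourly_profile(reference), hourly_profile(observed)))

human = human_pattern(days=30)
print("human vs human:", round(attack_1(human, human_pattern(days=30, seed=3)), 2))
print("human vs naive decoy:", round(attack_1(human, naive_decoy(days=30)), 2))
```

The naive decoy ignores the time of day, so its hourly profile is far from the human one; a convincing decoy must replay realistic statistics, which is exactly where the complexity (and the cost) comes from.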
The Dinner Example:
Bob’s view
• Alice and Bob discuss (at
15:00 hrs on the phone)
having dinner together.
• Bob calls restaurant R
(19:00), books table for 2.
• Bob picks Alice up from home
(20:00).
• Bob & Alice have dinner
together at R (20:30-22:30).
The dinner example:
Trudy’s view
Trudy has access to call data & location update
data of Bob & Alice. Trudy observes the
following events:
• Bob has called Alice (among others) and
restaurant R.
• At 20:00 Bob and Alice are in the same cell C1.
Alice’s home is also in C1.
• At 21:30 Bob and Alice are in the same cell C2
(≠ C1). Restaurant R is in C2 (see the sketch below).
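Assuming Trudy sees call records as (caller, callee, time) tuples and location updates as (person, time, cell) tuples, the events above are enough to reconstruct the dinner; the data and names below are illustrative:

```python
calls = [("Bob", "Alice", 15.0), ("Bob", "RestaurantR", 19.0)]

locations = [
    ("Bob", 20.0, "C1"), ("Alice", 20.0, "C1"),   # Alice's home is in C1
    ("Bob", 21.5, "C2"), ("Alice", 21.5, "C2"),   # restaurant R is in C2
]
known_places = {"C1": "Alice's home", "C2": "restaurant R"}

def co_locations(locs, a, b):
    """Times and cells at which persons a and b report the same cell."""
    seen_a = {(t, c) for p, t, c in locs if p == a}
    return sorted((t, c) for p, t, c in locs if p == b and (t, c) in seen_a)

def infer_meeting(calls, locs, a, b):
    """If a and b talked and were later co-located, guess a meeting."""
    if any({x, y} == {a, b} for x, y, _ in calls):
        for t, cell in co_locations(locs, a, b):
            print(f"{a} and {b} together at t={t} near {known_places.get(cell, cell)}")

infer_meeting(calls, locations, "Bob", "Alice")   # reconstructs the dinner
```

Nothing in the inference uses call content: metadata and coarse cell-level location are enough to reconstruct Bob's evening.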
Countermeasure
“reduce location updates”
• Location updates in GSM are generated
periodically, on power-on, and on call reception.
Their purpose is to reduce the search time for
an incoming call.
• Countermeasure: Cancel all periodic updates.
• Result: Location changes are not monitored
between successive calls.
• Disadvantage: Response time for incoming calls
is prolonged, and the phone search (paging)
algorithm becomes more complicated (a rough
cost model is sketched below).
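A rough, illustrative cost model (not actual GSM paging logic): assume the phone can cross at most one ring of cells per hour, so after t hours without an update it may be anywhere within hexagonal radius t of the last known cell:

```python
def cells_within(radius):
    """Number of cells within the given radius in a hexagonal layout."""
    return 1 + 3 * radius * (radius + 1)

def paging_cost(hours_since_update):
    """Worst-case number of cells the network must page for a call."""
    return cells_within(hours_since_update)

for t in (0, 1, 3, 8):
    print(f"{t} h since last update -> page up to {paging_cost(t)} cells")
```

With periodic updates, t stays small (fast call setup, but more location data for Trudy); cancelling them lets t grow (better privacy, but slower and costlier paging). This is the privacy/complexity trade-off in miniature.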
Enter Ambient Intelligence,
alias Pervasive Computing
AmI is the vision of an
environment
• filled with smart and
communicating devices,
• which are naturally embedded
in the environment and in
common objects,
• while their presence is kept as
unobtrusive as possible.
Ambient Intelligence
applications
Smart objects:
• tags,
• clothes,
• tools.
Smart spaces:
• smart floors,
• sound-based tracking mechanisms,
• face recognition devices.
Smart reasoning, management of smart devices:
• the Cooltown model.
Mediacup project
Smart mugs are equipped with
• sensors: temperature, motion (acceleration),
touchdown (switch under the bottom),
content (weight), location
• communication and processing capability.
Objective: to draw conclusions about everyday
activity from the way smart objects are used
(a toy inference rule is sketched below).
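A toy version of such a conclusion-drawing step, using the sensor list from this slide; the thresholds and rules are invented for the sketch and are not from the Mediacup project:

```python
from dataclasses import dataclass

@dataclass
class CupReading:
    temperature_c: float   # content temperature
    moving: bool           # acceleration sensor
    on_surface: bool       # touchdown switch under the bottom
    weight_g: float        # content weight

def infer_activity(r: CupReading) -> str:
    """Draw a (naive) conclusion from raw sensor values."""
    if r.weight_g < 10:
        return "cup is empty"
    if r.temperature_c > 60 and not r.moving:
        return "fresh hot drink, standing untouched"
    if r.moving and not r.on_surface:
        return "someone is drinking"
    return "idle"

print(infer_activity(CupReading(75.0, False, True, 220.0)))  # untouched hot drink
print(infer_activity(CupReading(55.0, True, False, 150.0)))  # someone is drinking
```

Even these trivial rules already reveal human activity, which is precisely why such objects matter for privacy.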
Context aware services
CA services take into account
• location, space form, lighting, weather, and other
properties of the environment,
• time,
• present persons, activities, sentiments,
• preferences,
• interaction/dependencies with/from other
services/applications.
They increase convenience while human-machine
interaction remains seamless.
Smart floor
• Tracking mechanism.
• Can distinguish between different persons, so it
can be used for identification (a toy matching
rule is sketched below).
• Can receive signals.
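A toy sketch of the identification step, assuming the floor extracts a per-step feature vector (mean pressure, step interval, stride length) and matches it against stored profiles; all features and numbers are invented:

```python
import math

profiles = {                     # (mean pressure, step interval s, stride m)
    "Alice": (520.0, 0.55, 0.62),
    "Bob":   (810.0, 0.70, 0.81),
}

def identify(measurement, threshold=100.0):
    """Nearest-neighbour match of a measured footstep feature vector."""
    name, profile = min(profiles.items(),
                        key=lambda kv: math.dist(kv[1], measurement))
    return name if math.dist(profile, measurement) < threshold else "unknown"

print(identify((790.0, 0.68, 0.80)))   # closest to Bob's profile -> "Bob"
```

Unlike a badge or a login, the floor identifies a person without any cooperation on their part.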
The main problem
If ambient intelligence is about collecting
and communicating information, how
can privacy be protected in such a
world?
Location privacy
• Capability to control one’s own location
information distribution.
• Total location privacy is not desirable:
Accidents can happen to an invisible person.
• However, modern technology is increasing
location resolution and the size of the potential
intruder population.
• Your location privacy is likely to be
compromised each and every time you
interact with an intelligent object of known
position.
The public is not always willing to
accept reduced location privacy:
The RFID case
• Benetton announces intention to use RFID in
products (March 2003).
• RFID supplier Philips does not publicise an
after-sales deactivation method.
• Consumers suspected a surveillance capability
both when purchasing and when using a product.
• Consumers boycott Benetton, which retreats.
Location v. identity
• Some location based services can be used
without identification: “When I reach a
product I would like to know absolute and
comparative price information.”
• E-payment services usually require
authentication (although there is ongoing
research on anonymous money).
• A Heisenberg-like principle would be desirable:
determining one of location and identity precisely
should leave the other uncertain.
Who is Trudy?
• A security/privacy study begins with an
assessment of possible threats and possible
enemies.
• In the past only spies and lovers were interested
in cryptography.
• Today there is not only industrial espionage, but
also surveillance of ordinary people’s habits for
promotional purposes (e.g. by using cookies).
• Law enforcement and national security agencies
are increasingly favouring less privacy for more
security and safety (safe flights v. safe
communications dilemma).
The “program”
According to a recent (17/8/06) US court decision
• “… a secret program … at least by 2002 and
continuing today … intercepts without benefit of
warrant or other judicial approval, prior or
subsequent, the international telephone and
internet communications of numerous persons and
organizations …”
• “Defendants have publicly admitted to the following:
(1) the … program exists; (2) it operates without
warrants; (3) it targets communications …”
• The decision connects loss of privacy with
economic impact.
The Athens Olympics “wire”-tapping
• Publicly announced in February 2006 by the
Greek government.
• Probably started in August 2004, prior to the
Olympic Games.
• Continued until March 2005, when it was
uncovered due to a massive SMS rejection alarm.
• The voice stream was duplicated and sent to a
group of prepaid-card mobile phones (“shadows”)
by special software residing in selected exchanges.
• The investigation remains open.
Arguments against privacy:
Industry’s view
• Companies and organisations are
vulnerable to abuse of their services by
their clients.
• The obvious “solution” is information
collection and information exchange.
• Example: The program “Teiresias”
monitors bank-related fraud. It is used to
forecast the behaviour of persons wishing
to borrow money.
Arguments against privacy:
State’s view
• Organised crime and terrorism are using more
and more advanced technology.
• Law enforcement mechanisms should be able to
make use of at least equally sophisticated
technology.
• Life is more valuable and worth preserving than
privacy.
• Surveillance technologies can save time and
money.
Increase Complexity
@ Network Level
• Mix networks (Chaum, 1981) rely on a chain of
proxy servers. Messages are wrapped in layers of
public-key encryption; each proxy strips one layer,
reorders messages, and forwards them.
• Routing onions (Goldschlag, Reed, and
Syverson, 1996) encode routing information in a
set of encrypted layers (a toy layering scheme is
sketched below).
• Mist routers (J. Al-Muhtadi, R. Campbell et al.,
2002): Components that know user location are
separated from components that know user
identity. Disadvantage: no protection against
physical observation (cameras, overheard talking).
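A toy illustration of routing onions: the sender wraps the message in one layer per hop and each router peels exactly one layer, learning only its predecessor and successor. Real systems use per-hop public-key cryptography; here a reversible XOR toy cipher stands in, and all router names and keys are invented:

```python
import base64

def toy_encrypt(key: str, plaintext: str) -> str:
    data = bytes(b ^ k for b, k in zip(plaintext.encode(),
                                       key.encode() * len(plaintext)))
    return base64.b64encode(data).decode()

def toy_decrypt(key: str, ciphertext: str) -> str:
    data = base64.b64decode(ciphertext)
    return bytes(b ^ k for b, k in zip(data, key.encode() * len(data))).decode()

def build_onion(message, route, keys):
    """Wrap innermost-first, so the first router peels the outermost layer."""
    onion = message
    for hop in reversed(route):
        onion = toy_encrypt(keys[hop], f"{hop}|{onion}")
    return onion

def route_onion(onion, route, keys):
    for hop in route:                          # each hop peels one layer
        name, onion = toy_decrypt(keys[hop], onion).split("|", 1)
        assert name == hop                     # layer addressed to this router
    return onion                               # plaintext emerges at the exit

keys = {"R1": "k1", "R2": "k2", "R3": "k3"}
onion = build_onion("hello Alice", ["R1", "R2", "R3"], keys)
print(route_onion(onion, ["R1", "R2", "R3"], keys))   # -> hello Alice
```

No single router sees both the source and the plaintext destination; that unlinkability is exactly the complexity the network pays for.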
Increase Complexity
@ Application Level
• Crowds is an anonymity system which hides a Web
surfer's behaviour within a group of other surfers: a
request is seen to come from some member of the crowd
rather than from a specific user (Reiter, Rubin, 1999).
The forwarding rule is sketched below.
• Lucent Personalized Web Assistant (LPWA): creates and
manages pseudonyms. On each return to the same web
site the same pseudonym is used. Different pseudonyms
are used in different web sites (Gabber, Gibbons et al,
1999).
• Anonymous Remailers: Change source & destination IP
addresses en route. Create dummy traffic.
• Mix Zones: “MZ is a connected spatial region of
maximum size in which no user has registered any
application callback” (Beresford, Stajano, 2003).
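A sketch of the Crowds forwarding rule from the first bullet: the initiator always forwards once, then each member flips a biased coin, forwarding to a random member with probability p_f and submitting to the server otherwise. Member names and parameters are illustrative:

```python
import random

def crowds_path(members, initiator, p_forward=0.75, rng=None):
    """Chain of crowd members a request traverses before submission."""
    rng = rng or random.Random(0)
    path = [initiator, rng.choice(members)]   # initiator always forwards once
    while rng.random() < p_forward:           # biased coin at each member
        path.append(rng.choice(members))      # forward to a random member
    return path                               # path[-1] submits to the server

members = [f"user{i}" for i in range(10)]
rng = random.Random(0)
for _ in range(3):
    print(" -> ".join(crowds_path(members, "user0", rng=rng)), "-> server")
```

The server sees only the last member in the path, so any crowd member is a plausible initiator of the request.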
Privacy @ application level
• Services/applications are not designed for privacy:
Skype’s default profile reveals the user’s proximity to the PC.
• Attack: In an AmI environment Trudy will analyse Bob’s
behaviour by combining different channels.
• Defence: Devise dummies which will interact with the
environment (possibly in different locations).
• Counter-attack: Its success depends on the dummy’s
intelligence/complexity.
• Overhead: Effective user population will be multiplied (to
the benefit of network operators and service providers).
Pseudonyms
• Long term pseudonyms
– can be uncovered through surveillance and behaviour
analysis.
• Short term pseudonyms
– require a change management system.
– When a system response is expected, the system must
know where to send it.
– Cannot be used with preference profiles: correlation of
profiles uncovers the pseudonym.
– If the location technique’s resolution is high, successive
pseudonyms can be correlated, so pseudonyms should change
only inside mix zones (a toy version of this attack is sketched below).
– Different pseudonyms of the same user can be correlated by
using user status information.
– If profile variations are used, small variations create a
correlation risk, while large variations create operational
problems.
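A toy version of the correlation attack mentioned above: with high-resolution sightings, a new pseudonym that appears exactly where an old one disappeared is trivially linked to it. Coordinates, timings, and the threshold are invented:

```python
sightings = [   # (time, pseudonym, x, y)
    (100, "p7", 10.0, 10.0),
    (101, "p7", 10.5, 10.2),
    (102, "q3", 10.9, 10.4),   # "q3" first appears where "p7" vanished
    (103, "q3", 11.4, 10.7),
]

def link_pseudonyms(sightings, max_jump=1.0):
    """Link consecutive sightings that form a spatially continuous track."""
    links = set()
    for (t1, a, x1, y1), (t2, b, x2, y2) in zip(sightings, sightings[1:]):
        if a != b and ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5 <= max_jump:
            links.add((a, b))
    return links

print(link_pseudonyms(sightings))   # {('p7', 'q3')}: the change is undone
```

A mix zone defeats this attack because no sightings are reported inside it, so several users can swap pseudonyms without a continuous track linking old to new.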
Pseudonym-related open problems
• How should a user react in low-privacy
environments?
• What is a proper privacy measure?
• Dummy users increase complexity at the
application layer, at the network layer, and at the
physical level (doors must open, purchases must
be made).
• How is privacy influenced by location resolution
and sampling frequency?
• Scalability problems?
Privacy protection systems
• Geopriv: Secure computational objects are used
to protect location and other personal data and
to enforce a privacy policy.
• P3P and APPEL: The user’s decision to accept or
reject a privacy level or policy is automated.
• pawS: Can be used for the description and
validation of privacy requirements. An incoming
user’s agent is given enough data to check
whether the existing privacy policy is compatible
with the user’s preferences (a toy compatibility
check is sketched below).
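In the spirit of that pawS-style check, a toy sketch: the incoming user's agent compares the space's declared policy with the user's preferences before entry. The policy fields and vocabulary below are invented for the sketch:

```python
space_policy = {   # what the smart space declares about each data type
    "location": {"collected": True,  "retention_h": 24, "shared": False},
    "identity": {"collected": False, "retention_h": 0,  "shared": False},
}

user_prefs = {     # what the user's agent is willing to accept
    "location": {"max_retention_h": 48, "allow_sharing": False},
    "identity": {"max_retention_h": 0,  "allow_sharing": False},
}

def compatible(policy, prefs):
    """Accept only if every collected data type respects the preferences."""
    for data_type, rules in policy.items():
        if not rules["collected"]:
            continue
        p = prefs.get(data_type)
        if p is None:
            return False, f"no preference stated for {data_type}"
        if rules["retention_h"] > p["max_retention_h"]:
            return False, f"{data_type}: retention too long"
        if rules["shared"] and not p["allow_sharing"]:
            return False, f"{data_type}: sharing not allowed"
    return True, "policy compatible with user preferences"

print(compatible(space_policy, user_prefs))   # (True, '...')
```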
Conclusion
• Privacy is a working assumption in today’s economy:
Reduce privacy and certain operational costs will increase.
• Ambient intelligence will not be accepted if it is
perceived as a privacy threat.
• Privacy can be defended at network level by increasing
complexity.
• Privacy can also be protected at application level.
Protection at application level is probably more expensive.
• Privacy can be subject to “multi-channel” attacks, the only
defence being to use expensive dummies.