Trust Reification and IoT
Roy Campbell
ICDCS 2013 Panel
“Is my toaster lying: security, privacy and trust issues in Internet of Things.”
Problems and Issues
• ABI Research estimates that more than 30 billion devices will be
wirelessly connected to the Internet of Things (the “Internet of
Everything”) by 2020
• Peter-Paul Verbeek (professor of philosophy of technology)
advocates viewing technology as an active agent.
• “… the intelligence community views Internet of Things as a rich
source of data,” Ackerman, We’ll spy on you through your
dishwasher, Wired 2012.
• David M. Nicol, Information Trust Institute, “in recent months,
cybersecurity has made the news on a near-daily basis… an
estimated 137.4 million cyber-attacks took place in 2012 alone,
according to an IBM report, and former Secretary of Defense Leon
Panetta has forewarned of a coming ‘cyber Pearl Harbor’.”
Vision: Turing said it right!
• Computers and Humans --- can one distinguish
one from another?
• Evolutionary Competition
• No such thing as a good device or a bad human
– spectrum of competing agents with differing motives
• We need a theory and practice of distributed
systems that gives us ways to reason about the
outcomes of games among systematized
intelligent agents
Properties of Solution
• Reification of trust: resiliency, availability,
confidentiality, privacy…
• Use of big data: monitoring ensembles formed by
agreement and empowered by collective action.
• Need-to-know or minimal information exchanges
• Evidence chains, policies and evaluations
• Endogenous formation of collective awareness
Issues
Trust as Discrete Events
• e.g., configuration changes, failures, audit logs,
changes to beliefs, changes to risk, …
• Hard to summarize
• Anonymization techniques
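Such a stream of discrete trust-related events can be modeled as a small record type with anonymization applied before sharing. A minimal sketch, where the field names and the salted-hash masking are illustrative assumptions, not the deck's design:

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class TrustEvent:
    """One discrete trust-related event (hypothetical schema)."""
    org: str          # originating organization
    kind: str         # e.g. "config_change", "failure", "audit"
    resource: str     # affected resource identifier
    timestamp: float

def anonymize(event: TrustEvent, salt: bytes) -> TrustEvent:
    """Replace identifying fields with salted hashes before sharing."""
    def mask(value: str) -> str:
        return hashlib.sha256(salt + value.encode()).hexdigest()[:16]
    return TrustEvent(mask(event.org), event.kind,
                      mask(event.resource), event.timestamp)

e = TrustEvent("orgA", "config_change", "db-server-7", 1700000000.0)
shared = anonymize(e, b"secret-salt")
```

Masking with a fixed salt (rather than dropping fields) keeps events correlatable across a monitoring ensemble without exposing the raw identifiers.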
Distributed architecture
• Cannot rely on a single entity to process information
• Confidentiality of records; liability reasons
• Multiple monitoring systems interacting without a single point of
aggregation
Information Leaks
Naming system
• Requests for resolution reveal that an organization has control of a
resource
Requests
• The presence of a request might imply the presence of a local sequence of
events matching the policy
Number of events
• Repeating the process multiple times reveals the number of matching
events
Challenges and Barriers
• Optimistic and somewhat static characterizations
of history and stable societies
• Monitoring and assessment of individual and
collective risk
• The formalization and analysis of a framework for
shared distributed decision making by
autonomous agents (human or machine).
• Self-validating framework for monitoring and
reasoning
Trust*
• Trust is a mental state comprising:
• (1) expectancy: the trustor expects a specific
behavior from the trustee (such as providing valid
information or effectively performing cooperative
actions);
• (2) belief: the trustor believes that the expected
behavior will occur, based on evidence of the
trustee’s competence, integrity, and goodwill;
• (3) willingness to take risk: the trustor is willing
to take a risk for that belief.
* Huang J, Nicol D (2010) A formal-semantics-based calculus of trust. Internet Comput IEEE 14(5):
38–46.
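The three components can be written down as a toy structure. The field names and the threshold rule for willingness are assumptions made for illustration, not part of Huang and Nicol's calculus:

```python
from dataclasses import dataclass

@dataclass
class Trust:
    """Expectancy, belief, and willingness to take risk (toy model)."""
    expectancy: str        # behavior the trustor expects from the trustee
    belief: float          # degree of belief in [0, 1] that it will occur
    risk_tolerance: float  # largest acceptable probability of betrayal

    def willing_to_rely(self) -> bool:
        # willing to take the risk iff disbelief is within tolerance
        return (1.0 - self.belief) <= self.risk_tolerance

t = Trust("provides valid monitoring events",
          belief=0.95, risk_tolerance=0.10)
```

The point of the structure is that trust is not the belief alone: the same belief can yield different decisions under different risk tolerances.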
Trust
• Confidence in or reliance on some person or
quality --- in this case trust-related event
notification
• Such events are all time and context
dependent
• Unilateral and Conditional Sharing of Events
• Reasoning about motives, events, risks, and
outcomes.
Tradeoff: Confidentiality vs Detection
Events provide knowledge about:
• network topology
• network traffic
• configurations
• installed programs
• vulnerable programs
• user behaviors
• services
• critical machines
• …
A spectrum: complete confidentiality permits only
detection of local security concerns; complete
openness permits detection of global security
concerns.
Can we find a tradeoff?
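One way to sit between the two extremes is a field-level redaction policy: each sharing level keeps some event fields in the clear and hashes the rest. A minimal sketch, with hypothetical level names and fields:

```python
import hashlib

# which event fields survive in the clear at each sharing level (assumed)
LEVELS = {
    "local_only": set(),                                   # full confidentiality
    "minimal":    {"kind", "severity"},                    # middle ground
    "open":       {"kind", "severity", "host", "service"}  # full openness
}

def redact(event: dict, level: str) -> dict:
    keep = LEVELS[level]
    out = {}
    for field, value in event.items():
        if field in keep:
            out[field] = value
        else:
            # hash rather than drop, so cross-site correlation stays possible
            out[field] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
    return out

event = {"kind": "failed_login", "severity": "high",
         "host": "10.0.0.7", "service": "sshd"}
```

Global detection then degrades gracefully: at "minimal", peers can still correlate repeated failed logins without learning which host or service was involved.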
Monitoring Architecture
[Diagram: monitoring servers deployed across a service provider,
cloud providers, and private infrastructure]
Multi-organization event-based monitoring
• Built on top of current monitoring
architecture
• Each organization detects problems in its
infrastructure independently
Contributions:
• Minimum information sharing / need-to-know in multi-organization systems
• Distributed logic reasoning algorithm
for policy compliance
• Minimal sharing obtainable for simple
policies; reduces information
exposure for more complex policies
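The need-to-know matching idea can be sketched as follows. A two-event policy fires only if organization A saw event a and organization B saw event b; A contacts B only after its own half matched locally, so B learns at most that A's half of the policy held. Organization names, event names, and the policy are hypothetical:

```python
# Toy sketch of need-to-know event matching across two organizations.
class Org:
    def __init__(self, name: str, local_events: set[str]):
        self.name = name
        self.local_events = local_events

    def has(self, event: str) -> bool:
        return event in self.local_events

def check_policy(org_a: Org, event_a: str, org_b: Org, event_b: str) -> bool:
    if not org_a.has(event_a):   # decided locally; nothing is shared
        return False
    # only now does A issue a request, revealing just that its half matched
    return org_b.has(event_b)

a = Org("A", {"config_change:db"})
b = Org("B", {"failed_login:web"})
```

Even this toy version shows the residual leak noted on the "Information Leaks" slide: the mere presence of A's request tells B that a local event sequence matched.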
Secure Two-Party Computation
Conditional Sharing
• Sharing occurs only if events a and b match the policy
• Event a known only by org A
• Event b known only by org B
Determine if the two events match without revealing them to
the other party
Garbled Circuits [Yao, 1986; Huang, 2012]
• Fast secure two-party computation
1. Encode each resource-based rule as a
combinatorial circuit
2. Provide event parameters as input from each
organization
3. If the result is true, the event is shared
• If not, almost no information is leaked
4. Repeat for each pair of private events
[Circuit example: inputs runsCritService(inst0, p) and partial(inst0);
output 0/1]
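The garbling step can be illustrated with a single AND gate, in the spirit of Yao's construction. This is a toy sketch only: SHA-256 stands in for the encryption scheme, labels are handed to the evaluator directly instead of via oblivious transfer, and there is no point-and-permute optimization:

```python
import hashlib, os, random

def H(ka: bytes, kb: bytes) -> bytes:
    """Derive a 32-byte one-time pad from two wire labels."""
    return hashlib.sha256(ka + kb).digest()

def garble_and_gate():
    # one random 16-byte label per wire per truth value: wires a, b, out
    labels = {w: (os.urandom(16), os.urandom(16)) for w in ("a", "b", "out")}
    table = []
    for va in (0, 1):
        for vb in (0, 1):
            out_label = labels["out"][va & vb]
            # encrypt label plus an all-zero tag so a valid row is recognizable
            plain = out_label + b"\x00" * 16
            pad = H(labels["a"][va], labels["b"][vb])
            table.append(bytes(p ^ q for p, q in zip(plain, pad)))
    random.shuffle(table)  # hide which row corresponds to which inputs
    return labels, table

def evaluate(table, label_a: bytes, label_b: bytes) -> bytes:
    """Holding one label per input wire, recover only the output label."""
    pad = H(label_a, label_b)
    for row in table:
        plain = bytes(p ^ q for p, q in zip(row, pad))
        if plain[16:] == b"\x00" * 16:
            return plain[:16]
    raise ValueError("no row decrypted cleanly")

labels, table = garble_and_gate()
out = evaluate(table, labels["a"][1], labels["b"][1])  # both events match
```

The evaluator learns the output label (shared / not shared) but, holding only one label per input wire, cannot decrypt the other rows, which is why almost no information about the other party's event leaks on a non-match.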
References
• M. Montanari, J. H. Huh, R. B. Bobba, and R. H. Campbell,
“Limiting Data Exposure in Monitoring Multi-domain Policy
Conformance,” TRUST 2013.
• J. Pitt, A. Bourazeri, A. Nowak, et al., “Transforming Big Data
into Collective Awareness,” IEEE Computer, June 2013.
• Garbled circuits [Yao, 1986; Huang, 2012].
• J. Huang and D. Nicol, “A formal-semantics-based calculus of
trust,” IEEE Internet Computing 14(5): 38–46, 2010.