Negating Trust: Intentionality and Enemy Mine
Giuseppe Primiero
Middlesex University, UK
Philosophy of Information and Information Processing
Infometrics Workshop, Oxford
Primiero
27 March 2014
1 / 21
The role of trust in computational domains
Trust is a crucial notion for a number of areas and applications
software management systems and web certificates
cryptography and authentication protocols
design and analysis of social networks
data analytics
reputation systems
...
Propagation [Mcknight and Chervany, 1996,
Ziegler and Lausen, 2004, Jøsang et al., 2006,
Jamali and Ester, 2010, Chakraborty and Karform, 2012]
In the last two decades, mainly quantitative approaches have focused on
understanding, modelling and anticipating trust propagation.
Analyses of propagation rely heavily on a correct formal representation of
transitive trust.
A Problem: Transitive trust
Example
If Alice trusts Bob and Bob trusts Carol, then Alice trusts Carol.
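Read operationally, transitive trust amounts to closing the trust relation under composition. A minimal sketch (the edge-set representation and agent names are illustrative, not part of the formal calculus):

```python
# Naive transitive trust: close the trust relation under composition.
# Illustrative sketch; edge set and agent names are assumptions.

def transitive_closure(trust_edges):
    """Repeatedly add (a, c) whenever (a, b) and (b, c) are trusted."""
    closed = set(trust_edges)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closed):
            for (b2, c) in list(closed):
                if b == b2 and (a, c) not in closed:
                    closed.add((a, c))
                    changed = True
    return closed

edges = {("Alice", "Bob"), ("Bob", "Carol")}
print(transitive_closure(edges))  # includes ("Alice", "Carol")
```

It is precisely this unconditional closure that the limited-transitivity protocol below restricts.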
What is New
Security: not only Facebook friends and Amazon books; tomorrow,
cyber-allies;
Privacy: algorithmic reputation methods tweaked by
personalised/contextual criteria;
Quality criteria: e.g. bias, subjective assessment, depth-boundedness,
graduality, confidence and negative values.
Another Problem: Trust Negation
Example (Enemy mine)
Alice does not trust Bob; Bob does not trust Carol; does Alice trust Carol?
(¬trust(A, B) ∧ ¬trust(B, C)) → trust(A, C)?
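The question mark is warranted: read classically, the formula is not a tautology, as a simple truth-table check shows (a hypothetical sketch; the propositional encoding of the trust atoms is an assumption):

```python
from itertools import product

# Truth-table check: is (¬trust(A,B) ∧ ¬trust(B,C)) → trust(A,C) valid?
def enemy_mine_valid():
    for tab, tbc, tac in product([True, False], repeat=3):
        antecedent = (not tab) and (not tbc)
        if antecedent and not tac:
            return False  # counterexample: premise holds, conclusion fails
    return True

print(enemy_mine_valid())  # False: the inference is not logically forced
```

So "the enemy of my enemy is my friend" needs extra structure, such as the intentionality conditions developed below, to go through.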
Forms of Negated Trust in the Social Sciences
[Cvetkovich, 1999, Earle and Cvetkovich, 1999]: distrust as a response
to lack of information
[Sztompka, 1999]: mistrust as former trust destroyed
A Computational Situational Approach
[Marsh and Dibben, 2005, Abdul-Rahman, 2005]
The first-order relation account of trust is situationally weakened and
extended with:
mistrust is misplaced trust;
untrust is little trust;
distrust is no trust.
It offers an exclusively quantitative modelling of negative trust.
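On such a purely quantitative reading, the labels reduce to regions of a single trust value. A sketch under assumed thresholds (the interval [-1, 1] and the cooperation threshold are illustrative; mistrust, as misplaced trust, is a retrospective judgement rather than a region of the scale):

```python
# Quantitative reading of the situational labels: a single trust value
# partitioned by thresholds. Threshold values are illustrative assumptions.

def classify(trust_value, cooperation_threshold=0.5):
    """Map a trust value in [-1, 1] to a situational label."""
    if trust_value < 0:
        return "distrust"  # no trust
    if trust_value < cooperation_threshold:
        return "untrust"   # little trust: below what cooperation requires
    return "trust"

print(classify(-0.3))  # distrust
print(classify(0.2))   # untrust
print(classify(0.8))   # trust
```

The limit of this modelling is exactly the talk's point: no threshold captures the intentional component of negated trust.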
Intentionality required
In HCI and CPS studies the intentional requirement emerges more clearly.
E.g. in [Hoffman et al., 2009], a scale is considered
unjustified trust (antitrust) → justified trust (skeptical) →
conditional trust (contingent) → unconditional trust (faith)
Some Design Principles
1. typed resources: contents are characterised by their sources (Genetical
Non-neutrality Principle) [Primiero, 2007]; accountability requires
coupling with anonymization procedures;
2. authorizations: agents are qualified (and made distinct) by authorized
actions to access resources;
3. trust: a property of a first-order relation of information transmission
[Primiero and Taddeo, 2012]; it facilitates operability on previously
inaccessible resources [Primiero and Raimondi, 2014];
4. untrust: an intentionally characterized property.
A Trust Protocol [Primiero and Raimondi, 2014]
Limited transitivity
Example
If Alice trusts Bob(’s resource) and Bob trusts Carol(’s resource), Alice can
trust Carol(’s resource) if
the same resource when typed by Carol is consistent with her profile
and
it can be proven equivalent to adding a resource from the local profile
(cut).
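The consistency half of this check can be sketched as follows (the string representation of resources and the consistency test are illustrative assumptions; the cut condition, proving equivalence with a locally derivable resource, is elided):

```python
# Limited transitivity, consistency check only: Alice accepts Carol's
# resource just in case its contrary is not already in her local profile.
# Resource encoding ('not ' prefix for contraries) is an assumption.

def consistent(profile, resource):
    """A resource conflicts if its contrary is already held."""
    return ("not " + resource) not in profile

def can_trust_transitively(local_profile, resource):
    # The full protocol also requires the cut condition (elided here).
    return consistent(local_profile, resource)

alice = {"d1", "d2"}
print(can_trust_transitively(alice, "d3"))                 # True
print(can_trust_transitively(alice | {"not d3"}, "d3"))    # False
```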
Distrust characterization
Data contradicting currently held information is distrusted
It is associated with intentionally false transmission
Receiver blocks contents to induce contraries
A Distrust Protocol
Blocking transmissions
Example
If Alice receives data d from Bob contradicting information ¬d she
currently holds and believes he has sent intentionally false data, she can
distrust and block d to maintain her original ¬d (and maybe send it to
Carol).
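The blocking step can be sketched as a guard on the receiver's update (the 'not '-prefix encoding of contraries and the intentionality flag are illustrative assumptions):

```python
# Distrust as blocking: contradicting data from a sender believed to
# transmit intentionally false data is rejected; held data is kept.
# Encoding of contraries via a 'not ' prefix is an assumption.

def negate(d):
    return d[4:] if d.startswith("not ") else "not " + d

def receive(held, incoming, sender_intentionally_false):
    if negate(incoming) in held and sender_intentionally_false:
        return held  # distrust: block incoming, keep current information
    return held | {incoming}

alice = {"not d"}
print(receive(alice, "d", True))  # {'not d'}: d is blocked
print(receive(alice, "e", True))  # {'not d', 'e'}: no contradiction
```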
Mistrust characterization
Data contradicting currently trusted information triggers a change in
trust behaviour
It is associated with unintentionally false transmission (of the
previously trusted data)
Eliminating trusted resources requires guarantors
A Mistrust Protocol
Checking information
Example
If Alice receives data d from Bob contradicting information ¬d she
currently trusts (possibly from a previous transmission), and she believes
she has unintentionally trusted false data, she can mistrust and block ¬d
and request Carol to confirm d. If Carol confirms the data, Alice can
revise and accept the information from Bob.
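The guarantor step can be sketched as a conditional revision (names, the encoding of contraries, and the confirmation callback are illustrative assumptions):

```python
# Mistrust with a guarantor: previously trusted data is revised only when
# a third party confirms the contradicting data. Encoding is an assumption.

def negate(d):
    return d[4:] if d.startswith("not ") else "not " + d

def mistrust_revise(trusted, incoming, guarantor_confirms):
    old = negate(incoming)
    if old in trusted and guarantor_confirms(incoming):
        return (trusted - {old}) | {incoming}  # drop old, accept new
    return trusted  # no confirmation: keep currently trusted data

alice = {"not d"}
print(mistrust_revise(alice, "d", lambda x: True))   # {'d'}
print(mistrust_revise(alice, "d", lambda x: False))  # {'not d'}
```

The guarantor requirement distinguishes mistrust from distrust: eliminating a trusted resource is never done on the receiver's belief alone.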
Enemy Mine
Paired intentionally false transmissions
Example (Enemy mine)
Carol sends data d to Bob contradicting information ¬d he currently
holds. Bob believes she has sent intentionally false data (she is trying
to deceive him) and blocks d and sends a modified ¬d to Alice.
Alice believes he has sent intentionally false data (he is trying to
deceive her), blocks ¬d from Bob and makes d available.
Data d from Carol to Alice is trusted (assuming the transmission has
preserved contents).
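The pairing of the two blocked transmissions can be sketched as two negating relays cancelling out (the message-passing encoding is an illustrative assumption):

```python
# Enemy mine: each agent who believes the sender is deceiving forwards the
# contrary of the received message. Two such hops cancel out, so Carol's
# original data reaches Alice intact. Encoding is an assumption.

def negate(d):
    return d[4:] if d.startswith("not ") else "not " + d

def relay(received, believes_sender_deceives):
    """Forward the contrary of a message the agent distrusts."""
    return negate(received) if believes_sender_deceives else received

d = "d"
bob_sends = relay(d, believes_sender_deceives=True)            # 'not d'
alice_holds = relay(bob_sends, believes_sender_deceives=True)  # 'd'
print(alice_holds == d)  # True: d survives two blocked hops
```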
What is new in information processing
Search for better protocols for user privacy and security: (un)trust is
crucial for many applications such as access control, social networks,
software management and the design of socio-technical systems
([Primiero and Raimondi, 2014, Boender et al., 2015,
Barn et al., 2015])
Requirement of qualitative assessment of epistemic properties in
algorithmic contexts
Investigation into their use to monitor and increase robustness and
resilience of computational systems ([De Florio and Primiero, 2015]).
References I
Abdul-Rahman, A. (2005).
A framework for decentralised trust reasoning.
PhD thesis, Department of Computer Science, University College
London.
Barn, B., Primiero, G., and Barn, R. (2015).
An approach to early evaluation of informational privacy requirements.
In ACM Conference on Applied Computing.
Boender, J., Primiero, G., and Raimondi, F. (2015).
Minimizing transitive trust threats in software management systems.
Technical report, Foundations of Computing Group, Department of
Computer Science, Middlesex University.
References II
Chakraborty, P. S. and Karform, S. (2012).
Designing trust propagation algorithms based on simple multiplicative
strategy for social networks.
Procedia Technology, 6:534–539.
2nd International Conference on Communication,
Computing & Security [ICCCS-2012].
Cvetkovich, G. (1999).
The attribution of social trust.
In Cvetkovich, G. and Lofstedt, R., editors, Social Trust and the
Management of Risk, pages 53–61. Earthscan.
De Florio, V. and Primiero, G. (2015).
A method for trustworthiness assessment based on fidelity in cyber and
physical domains.
CoRR, abs/1502.01899.
References III
Earle, T. and Cvetkovich, G. (1999).
Social trust and culture in risk management.
In Cvetkovich, G. and Lofstedt, R., editors, Social Trust and the
Management of Risk, pages 9–21. Earthscan.
Floridi, L. (2005).
The ontological interpretation of informational privacy.
Ethics and Information Technology, 7(4):185–200.
Hoffman, R. R., Lee, J. D., Woods, D. D., Shadbolt, N., Miller, J., and
Bradshaw, J. M. (2009).
The dynamics of trust in cyberdomains.
IEEE Intelligent Systems, pages 5–11.
References IV
Jamali, M. and Ester, M. (2010).
A matrix factorization technique with trust propagation for
recommendation in social networks.
In Proceedings of the Fourth ACM Conference on Recommender
Systems, RecSys ’10, pages 135–142, New York, NY, USA. ACM.
Jøsang, A., Marsh, S., and Pope, S. (2006).
Exploring different types of trust propagation.
In Stølen, K., Winsborough, W., Martinelli, F., and Massacci, F.,
editors, Trust Management, volume 3986 of Lecture Notes in
Computer Science, pages 179–192. Springer Berlin Heidelberg.
References V
Marsh, S. and Dibben, M. (2005).
Trust, untrust, distrust and mistrust – an exploration of the dark(er)
side.
In Herrmann, P., Issarny, V., and Shiu, S., editors, Trust Management,
volume 3477 of Lecture Notes in Computer Science, pages 17–33.
Springer Berlin Heidelberg.
Mcknight, D. H. and Chervany, N. L. (1996).
The meanings of trust.
Technical report, MISRC Working Paper Series 96-04, University of
Minnesota, Management Information Systems Research Center.
Primiero, G. (2007).
An epistemic constructive definition of information.
Logique et Analyse, 200:391–416.
References VI
Primiero, G. and Raimondi, F. (2014).
A typed natural deduction calculus to reason about secure trust.
In Miri, A., Hengartner, U., Huang, N., Jøsang, A., and García-Alfaro,
J., editors, 2014 Twelfth Annual International Conference on Privacy,
Security and Trust, Toronto, ON, Canada, July 23-24, 2014, pages
379–382. IEEE.
Primiero, G. and Taddeo, M. (2012).
A modal type theory for formalizing trusted communications.
J. Applied Logic, pages 92–114.
Sztompka, P. (1999).
Trust: a sociological theory.
Cambridge University Press.
References VII
Ziegler, C.-N. and Lausen, G. (2004).
Spreading Activation Models for Trust Propagation.
In Proc. IEEE International Conference on e-technology, e-commerce,
and e-service (EEE), pages 83–97, Taipei, Taiwan.
Best paper award for EEE2004.