The Meanings of Trust
Trust is a concern for companies at many levels. I had the pleasure of doing a deep dive on the concept
of trust and its applicability to software and to software-related businesses. Is trust a soft, nice-to-have
concept, or is it critical to your business? Let's consider some of the business implications of trust and
distrust in eCommerce and in the software retail industry.

• If a customer doesn't trust your company's brand, they will be reluctant to buy or may buy less frequently.

• If a customer doesn't trust your web site, they may transfer that distrust to the brand or may make purchases through alternative means.

• If a customer does not trust themselves to use the web technology, they may forego the purchase or make purchases through alternative means.

• If your web site presents an interaction with trust implications, requiring a trust decision, and does not give the potential customer the information necessary to resolve that decision, the user may abandon the interaction. (The best path when software asks you to make a practically impossible decision is to abandon the decision.)

• If the customer does not trust your company not to read, sell or misuse their data stored in the cloud, they will not use your cloud services.
These are some of the reasons why trust issues are critical to eCommerce. It should be very
clear that trust is a significant issue and not merely a nice-to-have consideration. The report below is a
partial summary of my investigation into the trust user experience.
Executive Summary
What is Trust User Experience? This working paper is intended to provide a useful definition of trust
user experience, by analyzing the meanings of the terms trust, user and experience, in order to influence
research content, measurement, scenario development, research methods, and the interpretation and
communication of findings for the author and members of the group. Key implications from this analysis
are provided after the summary.
Trust User Experience
TUX is an individual's reported experience of events when working with computers and, on the basis of
partial information, potentially exposing a value, through a possible vulnerability, to the risk of harm
from attack.

• An individual's self-reported experience includes their perceptions of past, present or imagined events, their perception of emotional reactions and cognitive impressions, and their related internal conversation.

• Their choice is based on their confidence in the expectation that the value will not be harmed, due to identification of the potential attacker as safe.

• The safe expectation is set by credible communications from or about the potential attacker, or reliable past experiences with potential attacker(s) suggesting they do not pose a threat.

• Attacks can target Security, Privacy, Reliability, Accessibility, Usability or Geopolitical Strategy Quality (SPRAUG) issues.

• "Value" is any aspect of the user's behavior, computer, software and data that the user would protect from change if they knew that change was being made without their permission, e.g., access to the system, data integrity, code that runs on the system, the user's relationship to other significant people, etc. (A minimal sketch of these definitional elements follows this list.)
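
To make the definitional elements above concrete, here is a minimal, illustrative Python sketch of a trust decision over an exposed value. All of the names (SpraugCategory, Value, TrustDecision) and the threshold are my own assumptions for illustration, not terminology or a model taken from the source document.

    # Illustrative only: a minimal data model for the TUX definition above.
    # Names and the threshold are assumptions, not source terminology.
    from dataclasses import dataclass
    from enum import Enum, auto

    class SpraugCategory(Enum):
        SECURITY = auto()
        PRIVACY = auto()
        RELIABILITY = auto()
        ACCESSIBILITY = auto()
        USABILITY = auto()
        GEOPOLITICAL_STRATEGY_QUALITY = auto()

    @dataclass
    class Value:
        """Anything the user would protect from unauthorized change."""
        name: str                       # e.g. "data integrity", "system access"

    @dataclass
    class TrustDecision:
        """A user's choice to expose a value, made on partial information."""
        value: Value                    # what is being exposed
        category: SpraugCategory        # which SPRAUG surface an attack could target
        expected_harm: float            # subjective probability of a bad outcome, 0..1
        evidence: list[str]             # credible communications / reliable past experiences

        def trust_granted(self, threshold: float = 0.1) -> bool:
            # Trust is extended when the expectation of harm is low enough.
            return self.expected_harm <= threshold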
1. The context of TUX is humans working with computing systems where the systems and their data may
be at risk for security, privacy, accessibility, reliability, and geopolitical content quality issues.
2. Setting for User Stories -- The context of use is like a theme that sets the stage in time, place and
social situation for the objects and action of the many user stories which might be examined. These
stories may cover a range of durations and user life experiences, from short-term flashes of insight or
images, through conversations, to long-term patterns of software usage over days / weeks / months.
Experiences are bounded by the environments within which they occur and by the scope of their
contents. Environments may include place, culture, language, social context, environmental objects
(hardware / software / documents), etc.
3. Trust is an emotion enabling one to expose a value, through a potential vulnerability, to a risk of harm
from possible attack, on the basis of partial information about the threatener's intentions, while
expecting no harm due to friendly social communications, a safe reputation and/or reliable past
experiences with the potential attacker(s).
Distrust is an emotion inhibiting one from exposing a value, through a potential vulnerability, to a
possible risk of harm from possible attack, on the basis of information about the threatener's intentions
sufficient to feel fear that the value may be harmed and to incent the user to take protective action.
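
Read operationally, the two definitions differ along two axes: how much information the user has about the threatener's intentions, and whether the resulting expectation is of safety or of harm. The sketch below is one possible reading of that contrast; the function name, states and thresholds are my own assumptions, not the author's model.

    # Illustrative sketch only: one possible operational reading of the trust /
    # distrust definitions above. Names and thresholds are assumptions.
    from enum import Enum, auto

    class TrustState(Enum):
        TRUST = auto()      # enables exposing the value despite partial information
        DISTRUST = auto()   # inhibits exposure; fear motivates protective action
        UNDECIDED = auto()  # not enough signal either way

    def assess(expected_harm: float, information_sufficiency: float) -> TrustState:
        """Both inputs are subjective estimates in [0, 1]."""
        if information_sufficiency >= 0.7 and expected_harm >= 0.5:
            # Sufficient information that the value may be harmed -> distrust.
            return TrustState.DISTRUST
        if expected_harm <= 0.1:
            # Expectation of no harm (safe reputation, reliable past experience) -> trust.
            return TrustState.TRUST
        return TrustState.UNDECIDED

The point of the sketch is only that trust and distrust are not simple opposites: trust tolerates partial information, while distrust requires information sufficient to justify fear and protective action.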
4. The "users" in Trust User Experience are humans representing their evolutionary adaptive history,
cultures and other categories of individual variation, acting in a specified role within a TUX usage
scenario in a story of their life. TUX users may:

• Work directly with computing devices and software,

• Work indirectly with system inputs / outputs,

• Possess information needed to use computer systems that would enable an attacker to mount an attack on a system,

• Not use the hardware/software system or data at all, but react to design, language or interaction choices that conflict or are compatible with their cultural, political and belief systems, or

• Have their presence extended into cyberspace, through user social relationships and states shown on software surfaces, and into info space via web 2.0 applications.
5. Experience is the user's perception of the flow of events, whether perceived in real time, remembered
or imagined, as they work with computers in the TUX context, including their own reactions and internal
dialog about those perceptions.
Key Implications of these Definitions
1. The TUX context includes individuals who do and do not work with the system and its applications.
We need to include scenarios for users who experience our design choices even indirectly. Limiting
ourselves to system users only can omit critical social engineering and geopolitical scenarios.
2. The "player in a story" orientation suggests that we should make larger-than-normal efforts to place
users in trust research in realistic contexts. The fact that trust allows an individual to expose some
value to risk of harm by an attack implies that, for research, if the user does not feel that a value is being
exposed, they won't experience trust emotions and the research may be invalid.
3. TUX user research procedures need to gather valid data on users' emotion-based learning and memory
as well as users' judgmental / analytical interpretations of the UI and their actions. For example, the
user's emotional and rational threat-assessment systems may give different estimates for the probability
of threats and may influence each other. Humans are poor at estimating probabilities, and our different
brains estimate threat probabilities differently. We need to understand the behavior of variables that
influence estimates and the factors that affect their accuracy.
4. Trust is an emotion experienced in a relationship between the trust grantor and trust requestor. TUX
research needs to examine patterns of verbal and non-verbal communication (rituals) whereby humans
create, validate, reject, maintain and recall relationships.
5. Trusting can be seen as a negotiation where terms and conditions are placed on the extension of trust
from one party to the other.
6. When we ask the user for a description of their experience, we are asking them for a report from a
particular point of view. Three key trust experiences are listed below (and sketched as states after the list):

• Trust Safety Experience – Feelings of confidence and knowledge that potential threats have been protected against, so that the user can feel safe without distracting themselves from their digital lifestyle; confidence that the values they may expose by using their computing devices and applications are not at risk.

• Trust Threat Recognition – In the face of actual threats, the user recognizes the threat as a threat, or the warning as a sign of a threat, and alerts themselves to take appropriate action. This may mean interrupting his/her current task flow. The user experiences feelings of fear appropriate to the threat's type, danger, proximity and imminence. These feelings drive them to take appropriate action to reduce the vulnerability and risk.

• Trust Protection Experience – The user knowingly and confidently makes choices they believe to be correct and takes action to protect against the threat. The user experiences a reduction in the feeling of imminent danger from the trust threat, feels no regrets after making the choices and taking the actions, and experiences a reduction in their assessment of the risk (the subjective probability of a bad outcome).
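
These three experiences read naturally as states a user moves through when a threat appears and is handled. Below is a minimal, illustrative state-machine sketch; the state and event names are my own assumptions, not the author's.

    # Illustrative sketch: the three trust experiences as states in a simple
    # state machine. State and event names are assumptions for illustration.
    from enum import Enum, auto

    class TrustExperience(Enum):
        SAFETY = auto()              # confident; values not felt to be at risk
        THREAT_RECOGNITION = auto()  # threat or warning recognized; fear felt
        PROTECTION = auto()          # protective action taken; risk estimate reduced

    def next_state(state: TrustExperience, event: str) -> TrustExperience:
        transitions = {
            (TrustExperience.SAFETY, "threat_recognized"): TrustExperience.THREAT_RECOGNITION,
            (TrustExperience.THREAT_RECOGNITION, "action_taken"): TrustExperience.PROTECTION,
            (TrustExperience.PROTECTION, "risk_reassessed_low"): TrustExperience.SAFETY,
        }
        # Unlisted (state, event) pairs leave the state unchanged.
        return transitions.get((state, event), state)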
7. Good TUX needs to exploit, and guard against, some characteristics of intuitive / emotion-based and
of rational / analytical learning and memory.

• Emotion-based learning:
  o Occurs very rapidly and on initial experience with events.
  o Can be very persistent and highly resistant to change.
  o Can lead reasoning to shape rationales for emotion-based choices.

• Rational / analytical learning:
  o Factual learning requires repetition and proceeds slowly.
  o Verbatim memory for facts is poor.
  o Factual memory is easily subject to error.
  o Factual memory can be interfered with by emotional memory.
8. For TUX, both for trusting and distrusting, we want to facilitate:

• Trust Implications –
  o User capability to recognize, research, test and build safe relationships online, and to recognize and distrust unsafe ones,
  o User capability to gain valid, credible, reliable reputation information about a potential attacker so they can make wise decisions,
  o Assessment of past experiences with the potential attacker(s) suggesting they are safe, with tools that confirm the validity / falsity of those decisions,
  o Users' control over their exposure of values (data, control, social information),
  o Users' awareness of potential and known channels by which others can interact with their systems, and controls over these channels so that the user is in control of access and privileges,
  o User interest, emotional satisfaction and desire to use trust software / hardware components to make and keep their system and data safe,
  o Users' ability to acquire, retain and access the knowledge required to efficiently and effectively use trust software / hardware components.

• Distrust Implications – we want to facilitate appropriate distrust protective actions (a small filtering sketch follows this list), such as:
  o Reduce the vulnerability through which the threat maker can act, the weakness in the defender's guard, e.g., blocking the phishing email's domain or sending address,
  o Discourage or drive away the threat maker, changing their intentions by making the attack more costly, more effortful and less likely to succeed,
  o Reduce the attacker's will to initiate or maintain the attack, e.g., change systems to charge money for spam, to identify the originating device, address and router path, or to require a calculation by the spam server for each email,
  o Reduce the value's exposure to the vulnerability by identifying threats (e.g., flag a phishing letter as junk and move it to the junk folder),
  o Reduce the capability of the threat maker to implement an attack; deprive the attacker of their weapons, logistical pathways to deliver attacks and other supports for mounting attacks, e.g., wall off suspicious code in a sandbox, or change routing systems to identify originating devices and addresses,
  o Identify the threat maker, charge them with crimes, bring them to justice, and publicize their pain, suffering and penalties.
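
As one concrete illustration of "reducing the vulnerability" and "reducing the value's exposure", here is a minimal sketch of a mail rule that flags messages from a blocked sending domain and moves them to a junk folder. The blocklist, message shape and function names are hypothetical, invented for the example; this is not a description of any particular mail client's API.

    # Hypothetical example of two protective actions named above: blocking a
    # phishing sender's domain and flagging the message as junk.
    from dataclasses import dataclass

    BLOCKED_DOMAINS = {"phish-example.test"}    # hypothetical blocklist

    @dataclass
    class Message:
        sender: str                  # e.g. "alice@phish-example.test"
        subject: str
        folder: str = "inbox"

    def apply_distrust_rule(msg: Message) -> Message:
        """Reduce the vulnerability and the value's exposure: quarantine blocked senders."""
        domain = msg.sender.rsplit("@", 1)[-1].lower()
        if domain in BLOCKED_DOMAINS:
            msg.folder = "junk"      # flag as junk and move it out of the inbox
        return msg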
[GEW: this was an executive summary. Details supporting these observations are in the source
document and are not available.]