Dear Colleagues: Please forgive this sketchy draft. I look forward to your comments and
reactions which no doubt will contribute to development and improvement. Sincerely,
HFN
TWO CONCEPTIONS OF TRUST ONLINE
Helen Nissenbaum
Department of Culture and Communication
New York University
Paper prepared for the conference on
Trust and Honesty
Budapest Collegium
December 12-13, 2002
Introduction
This paper reflects on the subject of trust online and on trust in general, to the extent that the
more limited domain yields insights into the general. As a mere fragment, the online
context serves only as a partial microcosm, yet, as an instance of a particular kind of
phenomenon – a natural phenomenon which has evolved on top of a deliberately
constructed artifact -- it brings into relief (in more tractable form, perhaps) compelling
dimensions of trust. Two, in particular, will be the focus of my paper: a) a duality in core
conceptions of trust, one emphasizing the instrumental connections between trust and
assurance, the other emphasizing the uncertainty, risk, and freedom inherent in trust, and
b) the interdependence between technical decisions about system design and the emerging
dominance, online, of one or the other of these conceptions. My thesis is that instrumental
conceptions of trust, which appear dominant in the concerns over trust online, will tend to
squeeze out trust-as-freedom when embodied in our networked information systems.
Should this occur, something valuable will be lost as a result.
Part One: Where Trust is a Means
One way to appreciate how trust emerges as a concern for participants in the development
and critical assessment of the online environment is through clusters of scholarly, trade,
and popular writings each focusing on particular aspects of online experience and
respective dimensions of trust. Clusters suggested by my own informal survey of these
literatures include: Computer and network security; E-commerce; Content; Interface
Design. In each case, trust has been identified as an explicit end, or goal, which could
and should shape their development. Brief descriptions follow.
1. Computers and network security
The field of computer security, which is now quite vast, grew in size and scope alongside
the developing fields of computing and information and communications technologies.
At some point along the way, perhaps when networked information systems overtook
stand alone computers as the focus of public attention, the purposes or ends of the field
found expression in terms of trust. That is, it became meaningful, even entrenched
convention, for the community of engineers and scientists within the branch (practice and
research) of computer security to express their higher-order goals, their mission, in
terms of trust. They sought to build “trusted” systems as well as “trusted” components of
systems and networks. I should note that for the most part it would be more accurate to
describe their purposes in terms of trustworthiness as this characterizes the state of the
artifact (worthy of trust, or not), while trust or trusted, according to standard usage, is
more fittingly applied to the attitude observers or users hold toward these artifacts. A
definitive report, Trust in Cyberspace (1999), of the Computer Science and
Telecommunications Board of the United States National Research Council, undertakes to
revise usage within the technical security community to match the standard.
According to this same report, networked information systems (like the Internet) are
trustworthy to the extent that they feature a constellation of properties, including
correctness, reliability, security, privacy, safety and survivability. In other words,
trustworthiness is conceived as an aggregate, or higher-order property, of systems that
manifest a constellation of other properties, presumably in some suitable degree and in
some satisfactory distribution. Each of the properties listed, though labeled in ordinary
terms and grounded in ordinary meanings, has, over time, developed a quite specific,
technical character in the context of computer security. (Privacy may be an exception.)
Security, for example, here conceived as a component of trustworthiness, itself refers to a
cluster of properties, usually including secrecy, confidentiality, integrity and availability
of data, integrity and availability of computer systems, and integrity and availability of
networks. Safety refers to a system’s or network’s capacity to buffer users from harm or
injury even when systems fail in some way; survivability (or robustness) to the capacity
of a system or network to function even when parts of it have been attacked or disabled;
and so on. Survivability has been of particular concern to experts who have keenly
observed the effects on the whole of a recent spate of attacks on key components, such as
Domain Name Servers and routers. (References) Trustworthiness, and its aggregate
dimensions, applies not only to whole networks but also to component parts. Thus, we may
speak, for example, of trustworthiness (or reliability, robustness, etc.) of networks
themselves, of network components, such as routers, servers, etc., of elements of the
physical infrastructure, or even of individual computers, or hosts, within a network.
To attempt more than a superficial survey of the components, or dimensions, of
trustworthiness in networked information systems would take us well beyond the scope
of this paper. Each component is itself the subject of large bodies of work, annual
conferences, government sponsored projects, and the sole focus of innumerable engineers
and technical experts working in industry. Furthermore, the interplay among the
dimensions is itself complex and of significant interest; some dimensions of
trustworthiness are correlated, but not all are. In promoting some, we might need to trade
off others because of the possibility of interference, prohibitive cost, or technical
infeasibility.
Nevertheless, even setting aside the task of substantive elaboration, it is worth noting that
research and development focused on trustworthiness devolves to the task of developing
security mechanisms, each dedicated to some small part, or some small goal, developed
out of a need defined by the overarching goal of trustworthiness. Later in the paper, I will
discuss some of these mechanisms.
((Our distance from trustworthiness is sometimes expressed in terms of degree of risk,
that is, as a function of how far away we are in meeting goals implied by these various
dimensions of system trustworthiness. Note: an ambiguity can creep in here, as in other
contexts, since trustworthiness need not imply perfection; it might simply mean that it makes
sense to trust, that it is rational to trust, even though there is a non-negligible risk of harm.))
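As a toy illustration of this way of speaking (a sketch of my own, not a measure proposed in the report), one might express distance from trustworthiness as a weighted sum of shortfalls from per-dimension targets; all dimension names, targets, scores, and weights below are hypothetical.

```python
# Toy sketch only: "distance from trustworthiness" as a weighted sum of
# shortfalls from per-dimension targets. All values here are hypothetical.
targets = {"security": 0.9, "reliability": 0.95, "privacy": 0.8, "survivability": 0.9}
scores  = {"security": 0.7, "reliability": 0.90, "privacy": 0.8, "survivability": 0.6}
weights = {"security": 0.4, "reliability": 0.3,  "privacy": 0.1, "survivability": 0.2}

def risk_distance(targets, scores, weights):
    """Weighted shortfall from each dimension's target; 0 means every goal is met."""
    return sum(weights[d] * max(0.0, targets[d] - scores[d]) for d in targets)

print(round(risk_distance(targets, scores, weights), 3))  # 0.155
```

On this toy reading, "trustworthy" need not mean a distance of zero; it may simply mean the distance is small enough that trusting remains rational, which is the ambiguity noted above.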
A different way to characterize the quest for trustworthiness is in terms of threats. A
networked information system (like the Internet) is trustworthy to the extent it can
withstand, deflect, or minimize the impacts of breaches of the system, which can take on
a variety of forms, from technical failure (or breakdown) and incompetence of designers and
operators to a surprisingly large array of malicious attacks. In the face of these
multifarious threats, trustworthy systems are those that remain available and true to
function; data both in storage and in transit is protected against undesired exposure and
corruption. As with clean air, a condition that mostly went unremarked until threatened
by growing pollution, so the trustworthiness of our networked information systems is
appreciated in the face of increasing threats of failure and attack.
2. Electronic Commerce
Another domain where trust features prominently is electronic commerce (or,
ecommerce). Ecommerce, here, is taken to cover a variety of commercial exchanges
transacted via the Internet, including those where buyers place orders online for delivery
of physical goods (Peets.com, Amazon.com, eBay.com, etc.), where the goods delivered
are intangible (e.g. information, subscriptions to online publications), where what is
purchased is a membership of one kind or another (e.g. a pornographic site, a dating
service, etc.), and where a website provides services to customers (e.g. online bill
payment, online gambling, participation in game playing). These transactions could occur
between individuals and businesses or between businesses. Proponents of Internet
commerce have seen the promise of great advantage to both consumers and businesses.
Despite the more muted hopes and expectations following the failure of many dot.com
initiatives, many remain sanguine. Even the most optimistic, however, stress the
importance of trust to success, to the realization of promise. (For example, Resnick,
Zeckhauser, Friedman, and Kuwabara; Jones, Wilikens, Morris, Masera; Hoffman,
Novak, and Peralta)
Consider a few of the junctures where trust is crucial. Take a typical example of an
individual ordering physical goods from a merchant, who, for argument’s sake, does
business only online. The consumer must trust the merchant to charge the correct price,
provide goods of satisfactory quality, and deliver as promised. Merchants must trust
customers to pay and not repudiate their commitments. Consumers and merchants must
trust banks and credit card companies to convey funds and they, in turn, must trust
consumers and merchants. All parties must trust the competence and integrity of vendors
of ecommerce systems -- that the systems perform their functions reliably and safely,
and, a concern foremost on the minds of consumers, that they have the capacity to securely
transmit and store critical, sensitive information like credit card numbers and personal
information about consumers, including the content of their orders. Finally, trust in ecommerce
presumes a degree of trust in the Internet generally, as discussed earlier. One could go through a
similar analysis of all the modes of ecommerce transactions.
Many issues raised in the context of ecommerce face parties in traditional commerce as
well, such as trusting a merchant to sell high quality goods, or not to overcharge credit
cards, though we are likely to be more familiar and comfortable with policies and
strategies to deal with risks and failures. Some, however, like those involving
trustworthiness of the underlying protocols and systems, are unique. And others are novel
because they involve entities or transactions that deviate somewhat from the familiar.
Although auctions are not new, for example, online auctions such as those conducted
through eBay are different in detail in ways that might make a difference in degree of
trust. Purchasing travel reservations online from companies we may not previously have
known raises similar concerns of sheer novelty. For example, downloading information
or software from a dot.com company might (or at least should) raise concerns over
unwanted code that might be hidden within it.
Ecommerce is undergoing rapid evolution as individuals and businesses learn from their
own experiences and those of others, at the same time that legal and financial institutions
adapt old and craft new policies and practices for them. One important example of
adaptation was legal recognition given (through an act of the U.S. Congress) to digital
signatures. We have also seen evolution in societal responses to “fringe” commerce in
such areas as online gambling, pornography and other “adult” offerings, the latter
constituting a large proportion of online activity. I should qualify that I use “fringe” to
contrast these activities with culturally more central activity, not to comment on their size,
as adult-content offerings have proven enormously popular. These fringe websites are of
particular interest because they are facing changes attributed to trust, or rather, lack of
trust. For a start, Internet gambling has been all but squelched by credit card companies
refusing to honor gambling debts incurred online. These companies have also begun
levying surcharges on adult-content sites, defending their actions against charges of
private censorship by pointing to a higher incidence of untrustworthiness in both the
operators of adult websites (who regularly engage in dishonest billing practices, like
charging credit cards even after subscriptions are cancelled) and in consumers, who
(perhaps out of remorse or fear) regularly deny prior associations with these sites and
repudiate charges (resulting in costly “chargebacks”). (See “Credit Cards Seek New Fees
on Web’s Demimonde,” by Matt Richtel and John Schwartz, The New York Times,
November 18, 2003)
3. Interface Design and Content
Two other topics that have generated clusters of work on trust are interface design and
content. I provide a brief overview of each.
Content. The intention behind the creation of the World Wide Web was to build a system of
ready access to information stored in computers around the globe (Berners-Lee). The
Web would take advantage of Internet connectivity but would create standards for
sharing information to overcome existing barriers to free flow. Our experience of the
Web and Internet reflects the enormous success of this effort, which provided access not
only to established institutions, like the ones with which Berners-Lee began, but to an
enormous array of individuals and collectives from multinational corporations to
governments to renegade groups and individuals. The results are breathtaking in size and
scope.
At the same time that this freedom to share and find information, the absence of global
gatekeepers, and the unconstrained access to publishing lead to a vast and varied trove, they
also result in content of varied quality. In referring here to quality, I do not mean to
imply a single, linear, standard by which content may be measured. I am suggesting,
however, that commitment to any standard or standards will lead to a set of questions or
worries about what content is credible and can be trusted and what is not and cannot be. One
can imagine this question ranging over a broad array of possibilities from announcements
of events to claims of miraculous medical cures to world news to scientific reports to
financial offers and more. Individuals struggle with these questions (though perhaps not
as much as they ought to), while information scientists pose them as fundamental challenges
of their profession.
Interface Design. Research on website design, particularly ease of use and appeal of
interface, has also considered systematic relationships between design and trust. Goals of
this line of research are somewhat different from the others as they are not necessarily
focused on how trustworthy systems, commercial offerings, and content are, but on whether
users will in fact extend trust to them, whether users will perceive them as trustworthy. In
other words, the questions asked by those interested in usability and interface design
include how to frame information, what sort of information to provide, what aesthetic or
visual factors have bearing on users’ responses to an online presence, offering, or
invitation.
Some of the efforts to shape user experience and reaction through website and interface
design draw on findings from other social and commercial experiences. In one case, for
example, on-screen, animated characters representing real-estate agents were
programmed to follow interactional rituals like pleasant greetings including small-talk
that referred to stories of past benevolence. Studies indicate that such techniques increase
users’ trust in these systems. (Cassell and Bickmore) Or, in other cases, experts advise
designers via guidelines and principles such as disclosing patterns of past performance
(e.g. an airline reservation system) through testimonials from other satisfied users,
making policies clear and salient, avoiding misleading or ambiguous language, and so
forth. (Friedman et al.; Shneiderman; Fogg et al.)
Findings on the factors that induce trust, or distrust, in computer and Web users, though
related to trust online generally, are in the end probably best handled as a separate inquiry.
While the latter focuses on trustworthiness and trust, the former (study of interface
design) focuses primarily on how to seem trustworthy no matter what the underlying case
may be. Those who study user related design factors generally acknowledge the moral
ambiguity of their craft and encourage readers to take the moral high road along with
them.
Achieving Trust Online
Before considering responses to these challenges, it is worth noting, as should be evident
from the cases above, that trust online is not a wholly new species of trust; challenges are
quite similar in form to those we encounter more generally within both specialized
relationships and in society at large. There is value, however, in dealing with trust online
as a distinct topic of study because the online context, which in some sense constitutes a
simplified model of life in general, highlights particular aspects of trust that can be lost in
the “noise.” There are, furthermore, historical and cultural dimensions of the online
context that are relevant to our understanding of trust and distrust. Finally, because the
online context is layered upon an artificial or constructed technological infrastructure, it
allows for, at least in principle, trial and experimentation in ways not as readily available
in the thick world of physical being.
Many (though admittedly not all) of the responses to trust-related challenges posed earlier
can be understood as ways of coping with several of the distinctive properties of online
experience. In an earlier paper, I described obstacles to trust online, in particular,
properties and conditions, absent or attenuated online, that typically form the basis for the
formation and maintenance of trust attitudes and judgments. One is that we may not
know the true identities of those (people and organizations) with whom we interact, a
condition that interferes with the possibility of learning from experience (one’s own and
others), establishing reputations, and building cooperative and reciprocal relationships.
Another is that the disembodied, mediated nature of relationships online lacks sensory
information about personal characteristics that cue us to trustworthiness of others.
Finally, contexts and opportunities online remain somewhat ill-defined, ambiguous,
inscrutable; we remain unsure of meanings, responsibilities, and norms. (For more detail,
see Nissenbaum.)
With these features of online experience as a backdrop, it is easier to understand the
rationale behind various approaches that have been taken with the purpose of achieving
trust, or increasing the level of trust, online within the areas mentioned above.
Approaches, which include technical, institutional, and legal measures, attempt in one way or
another to address, or compensate for, potentially limiting dimensions of the relevant
features of online experience. I don’t mean to imply the existence of a unified, organized
attempt to address the problem of trust online. Rather, what we see are numerous efforts,
some independent, others building upon each other, to produce a set of mechanisms, each
of which tackles some small element or aspect of the larger issue of trust online.
In the arena of trust in computer systems and networks, for example, it has been
predominantly scientists, engineers, research sponsors, and government agencies, who
have focused on approaches to creating more robust systems by such means as
redundancy for critical elements, mechanisms for detecting unusual patterns of activity
(that might signal attacks), capacity in the system as a whole to overcome untrustworthy
components, and so on. They fortify systems against attacks with layers of security:
passwords, firewalls, restrictions designed into computer languages (like Java) to limit
what programmers can and cannot accomplish (not something generally grasped or even
noticed by most non-technical users), limiting access to authorized people and code.
Another part of the picture has been security of network connections, or pipelines to
protect information in transit against theft or alteration, to ensure greater confidence in
the integrity of information flowing throughout our networks. Secure transmission is
considered an essential element for trust in ecommerce as consumers will want to avoid
theft of or tampering with sensitive information like credit card numbers, bank transactions,
and passwords.
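To make concrete what it means to protect information in transit against alteration, here is a minimal sketch (my own illustration, not drawn from any particular system discussed here) of a keyed message authentication code: sender and receiver share a secret, and any tampering with the message in transit makes verification fail. The key, messages, and function names are hypothetical.

```python
import hashlib
import hmac

# Hypothetical secret shared by sender and receiver.
SHARED_KEY = b"example-shared-secret"

def tag(message: bytes) -> bytes:
    """Compute a keyed digest (HMAC-SHA256) that travels with the message."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, received_tag: bytes) -> bool:
    """Recompute the digest on arrival; alteration in transit makes this fail."""
    return hmac.compare_digest(tag(message), received_tag)

order = b"charge card ending 1234 for $25"
t = tag(order)
print(verify(order, t))                                  # True: arrived intact
print(verify(b"charge card ending 1234 for $2500", t))   # False: altered in transit
```

A full secure channel such as SSL/TLS also encrypts the traffic to guard against theft; the sketch covers only the integrity side of the guarantee.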
Mechanisms to restore identity have become increasingly sophisticated. So-called “trust
management systems", for example, seek to establish reliable links between parties
(individuals and organizations) and the commitments they make and actions they undertake
online. These systems typically involve unique identifiers (for example, digital
signatures) for participants backed by trusted intermediaries (“signature authorities”)
who vouch for the validity of the linkage. As biometric schemes become cheaper and
more effective, we are likely to see their use in establishing even tighter and sounder
links between online presence and identity. It is worth remarking that given massive
databases of personal information, which have been accumulated and aggregated over
years of data surveillance, identification links parties not only with particular actions
online but with rich profiles as well. Thus, identification mechanisms promise to increase
our capacity not only to hold parties accountable for wrongdoing but also, in the first place, to
predict more reliably what they are likely to do.
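As a rough illustration of the linkage such systems aim to establish (a simplification of my own, not a description of any deployed trust management system), the sketch below has a signature authority vouch for a participant's public key, after which the participant's signed commitments can be checked against that key. It uses the Ed25519 primitives of the Python cryptography library; all keys, names, and the commitment text are hypothetical.

```python
# Illustrative sketch: a "signature authority" vouches for a participant's public
# key, and the participant signs a commitment made online. Values are hypothetical.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

authority_key = Ed25519PrivateKey.generate()      # the trusted intermediary
participant_key = Ed25519PrivateKey.generate()    # an individual or organization

# The authority vouches for the participant by signing the participant's public key.
participant_pub = participant_key.public_key().public_bytes(
    encoding=serialization.Encoding.Raw, format=serialization.PublicFormat.Raw
)
endorsement = authority_key.sign(participant_pub)

# The participant later signs a commitment undertaken online.
commitment = b"I will ship the ordered goods within five business days."
commitment_sig = participant_key.sign(commitment)

def linkage_holds(authority_pub_key, pub_bytes, endorsement, message, signature):
    """True only if the authority vouches for the key AND that key signed the message."""
    try:
        authority_pub_key.verify(endorsement, pub_bytes)
        Ed25519PublicKey.from_public_bytes(pub_bytes).verify(signature, message)
        return True
    except InvalidSignature:
        return False

print(linkage_holds(authority_key.public_key(), participant_pub,
                    endorsement, commitment, commitment_sig))  # True
```

The design point is that each link in the chain must itself be verified or trusted; richer certificates or biometric identifiers could stand in for the raw key, but the structure of the linkage remains the same.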
Mechanisms that enable sound (accurate and reliable) identification are fundamental to
many other mechanisms that build upon them to address a series of specialized problems
related to trust in the various arenas discussed above. One controversial application has
emerged in the context of the content industries (movies, music, fiction, art, etc.). As has
been amply documented in legal disputes, scholarly literature, as well as popular news
media, a highly distrustful relationship has developed between the industry and its mass
audiences. (refs.) From the perspective of the industry, viable systems of Digital Rights
Management (DRM), which necessarily build upon the capacity to identify users as well
as content, are necessary to rebuild trust. With DRM systems they will be able to keep
track of sales and usage and put a stop to peer-based unauthorized sharing.
Another family of mechanisms that has shown promise for increasing trust online comprises
those supporting reputation systems. Although these systems do require extended,
unique, identities to which reputations attach, these identities might not necessarily link
to anything outside of a given context. Clearly, reputation is not distinctive of the online
context, but computational and network capabilities enable a variety of options, including
some that would be virtually impossible to implement offline. Some reputational systems
have familiar form – individuals and organizations that undertake the role of trusted
authority promulgating their appraisals of others – individuals, services, businesses,
information, etc.
These systems may range in focus from the most specialized, like TRUSTe, which offers
seals to websites on the basis of their privacy policies, to the most general, like search engines.
Among the most novel and interesting to proponents of online experience are peer-based
reputation systems not based on authority or expert appraisals but upon a computed
aggregate of peer appraisals. (Zagat’s restaurant guides may be an offline equivalent.)
A prominent example is eBay's Feedback Forum, which allows buyers and sellers on this
massively popular online auction site to rate each other following transactions. Anyone
considering whether to engage in a transaction with another via eBay can check the other's
Feedback Forum reputation, though, as shown by Resnick et al., reputation systems are far from perfect,
even ones as mature as Feedback Forum. Because reputation systems can apply to
people, as well as organizations, information, political opinions, and more, they offer
promise for many of the contexts discussed above in which the issue of trust arises.
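To illustrate what a computed aggregate of peer appraisals might look like in the simplest case, here is a small sketch of my own (not eBay's actual algorithm): each completed transaction yields a rating of +1, 0, or -1, and a party's reputation is the running sum of ratings received. All parties and ratings are hypothetical.

```python
from collections import defaultdict

# Hypothetical post-transaction ratings: (rater, rated_party, score in {-1, 0, +1}).
ratings = [
    ("alice", "bob",   +1),
    ("carol", "bob",   +1),
    ("dave",  "bob",   -1),
    ("bob",   "alice", +1),
]

def reputation_scores(ratings):
    """Aggregate peer appraisals into one score per party by simple summation."""
    scores = defaultdict(int)
    for _rater, rated, score in ratings:
        scores[rated] += score
    return dict(scores)

print(reputation_scores(ratings))  # {'bob': 1, 'alice': 1}
```

Deployed systems layer safeguards onto this idea that the sketch omits, for instance tying ratings to completed transactions, which is part of why even mature systems remain imperfect in the ways Resnick et al. document.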
Trust and Assurance
Readers who have tracked developments in new media and are familiar with the broad
spectrum of social, technological, and creative possibilities online will probably complain
that I have omitted a set of cases that get closer to the heart of trust than any that I have
mentioned so far. They would be right. But before I turn to those cases, and to the second
conception of trust, let us review features common to those discussed already and how
they contribute to a conception of trust as a means.
In the cases described earlier and in many others bearing no connection to the online
context, trust functions to close a gap of uncertainty. In these cases a person is challenged
to take action or make a decision without having all the relevant information to be able to
know with certainty (or a high degree of certainty) what the outcome of her or his action
will be. We place orders online not entirely certain that the businesses are “real”, whether
we have recourse if they take payment and fail to deliver, whether they will respond to
complaints if the items we buy are damaged. We avidly absorb information about cancer
treatments, availability of drugs not yet approved by the FDA, the weather in Bangladesh,
a new failsafe diet. We visit pornographic and other websites, place bets, download free
software and songs, enter competitions, make airplane reservations, get legal advice,
open up email messages from people known and unknown to us.
Most of us are aware that there is risk involved in all these activities – in some more than
in others. Yet we undertake these actions (or decisions) with an attitude of trust (or, if you
prefer, a decision to trust), deliberately placing our fates in the hands of others who have
the capacity to harm us in some way. As such, trust acts as a bridge between uncertainty
and action.
What I would like to suggest is that for cases of the kind that I discussed above, trust is a
compromise. In an ideal world, or an ideal state-of-affairs, we would be assured that
networks will not fail, that we will not be attacked by criminals and viruses online, that
financial transactions will go through as intended. We would be certain that our messages
go where we send them and retain confidentiality and integrity. We would be assured of
reliability of information we access, of the honesty of online merchants. But, in reality,
we cannot even hope for these assurances and so in order to function, we need to make
calculated judgments, trusting some, distrusting others. We do not necessarily trust
blindly, but will invoke the experiences and strategies cited by many philosophers and
social scientists who have studied trust. (These claims skirt deep conceptual issues
which will require further development and defense.) Proponents of online resources, too,
will build on insights about trust in order to engender a climate of trust and encourage
trust, and therefore, participation online.
This conception of trust is somewhat humbling in that, as with other instrumental goods,
it grounds the value of trust in other ends, leaving open the possibility that, to the extent
these ends can be achieved in other ways, trust conceivably loses its value. There are
already cases where we reject trust in favor of other modes of assurance. Such is the case
at airports, for example, where (I would argue) we have no interest in trust but prefer
perfect assurance that fellow passengers will not be able to carry out lethal plans. The
same is true, in general, of so-called “high security” zones. (This point is discussed also
in my earlier paper.) Under what I have called an instrumental conception of trust, the
distance between high security zones and online cases described above is a measure not
of moral but of economic difference. Richard Posner’s rebuttal of Charles Fried’s defense
of a strong but limited right to privacy is perhaps emblematic of this instrumental view of
trust. Posner rejects Fried’s claim that privacy is the necessary context for both intimacy
and trust not by denying the connection between privacy and, in this instance, trust, but
by denying the importance of the loss: “trust, rather than being something valued for
itself and therefore missed where full information makes it unnecessary, is, I should
think, merely an imperfect substitute for information.”1

1 Richard Posner, The Right of Privacy, 12 GA. L. REV. 393, 408 (1978).
Part Two: Where Trust is the Point
In Part One, I reviewed a number of arenas in which trust has been explicitly
acknowledged for its potential to undergird investment of one kind or another in the
online promise. They shared an instrumental conception of trust. Here, I turn attention to
a more difficult arena, where trust is clearly just as important but tends to feature implicitly
rather than as an explicit subject of systematic study. To learn more about this arena, we
need to move away from the realms of Ecommerce, broadcasters, and mainstream
information providers to the helter-skelter of some of the novel life-forms that the system
of networked computers has borne. Although there is no simple way to characterize the
entire range I mean to cover, certain features stand out.
The activities tend to be outside the for-profit sector, tend not to be associated with
corporate or established political presences, tend to involve peer-based interaction rather
than classic hierarchy or authority. Though they extend across a wide variety of
substantive interests from political to recreational to technical, they tend not to involve
relationships of buying and selling. Examples of such activities are the myriad online
communities that consolidate around grassroots cultural, gender, environmental, and
political interests, facilitated by software systems that allow for the formation of such
groups online (Slashdot, ethePeople, Institute for Applied Autonomy, etc.). The range also
includes communities drawn together in recreational activities, MUDs (Multi-User
Dungeons), MOOs (Multi-User Object-Oriented domains), collaborative art sites, peer-to-peer
music exchanges, instant messaging, and more. Some revolve around the
planning of civic action, the building of civic, geographically-bounded, communities, the
promotion and protest of civic causes and more. Their mode of interaction (or
performance) can be varied, too, including discussion, organization, game-playing,
information production, protest, voting, and linking. (Note: This list needs a great deal
more research and refinement.)
Although it is important to understand whether and in what ways these online efforts and
communities are unique, distinctive, unprecedented, it is also useful to consider ways,
relevant to trust, in which they are continuous with analogous efforts and communities
not mediated or enabled through the Internet. In particular, it is useful to highlight their
connection with traditional ideals of human relations embodied in friendship, citizenship,
community, camaraderie, family, and more. Trust within these relationships is not
predominantly instrumental as it is in the contexts earlier discussed, but rather, in
significant ways, trust is the point. Trust is an essential part of the ideals toward which
relationships of these kinds strive.
There are, no doubt, many heroic and legendary stories that illustrate the essential
connection between trust and these relationships but for purposes of this paper, let us
consider a somewhat obscure case from the world online. The case, Telegarden,
described in a study of online collaborative art sites2, is typical of the fluid borders
between online activity and its physical manifestations. The Telegarden website
mediates interactions among a community of participants who are dedicated, primarily, to
the cooperative design and maintenance of a garden. The garden itself is “real”, that is to
say, it features six to seven square feet of cultivated soil, currently on exhibition in the
Austrian museum, the Ars Electronica Center.

2 I am grateful to Gaia Bernstein for sharing this case with me. It is part of a larger collaborative book project of the Information Law Institute at NYU. (Details forthcoming.)
The garden is cultivated by a robot (with occasional help from museum staff), which
carries out the commands of the Telegarden site participants. The robot can be commanded
to plant seeds in specific locations, water the garden, and provide visual feedback to
participants. The Telegarden website and robot, created in 1995 by a team of computer
science and engineering faculty of the University of Southern California, is now fully
controlled by participants who “visit” the garden frequently to perform maintenance, see
the garden, and mingle with friends they have made on the site. Every few months, when
the garden overfills with plants, the museum staff replace it with fresh soil and a new
growing season is declared.
The Telegarden project can be studied from a variety of perspectives, each focusing on
particular dimensions: its technical features, its mode of governance, the nature of the
community that has grown around it, why individuals derive a sense of enjoyment and
satisfaction from it, and the aesthetic character of each completed garden. For purposes of
this paper, let us focus on an aspect of the site’s governance (as embodied partly in its
technical design and partly in explicit norms). Though participants can plant only after
they have served as members (which means registering by full name and email address)
for a sufficient length of time, measured by the frequency with which they water the
garden responsibly and post messages in the chat rooms, there are few restrictions
thereafter.
This freedom means that the garden is vulnerable to sabotage such as over-watering and
planting seeds over seeds already planted. There is also no censorship of materials posted
to the community chat room. On rare occasions, when members over-watered or posted
pornographic pictures to the chat room, organizers chose not to alter the site's mechanisms
to make over-watering impossible or cancel the picture uploading function. This is
crucial. Organizers could have prevented damage to the garden by an alteration to system
software but they chose not to. In so doing, and in this small way, they declared trust as a
defining character of the community. One could imagine a version of Telegarden whose
system restricted member actions, making it impossible for them to harm the garden,
but it would be significantly different, not as interesting, not as compelling, not as
exhilarating. The reason is that in the existing version of Telegarden, trust is not simply
endured as an imperfect substitute for information; trust is the point.
Conclusion: Significance for Design
Based on a review of formal and informal studies of the online milieu, I have posited the
presence of (at least) two distinct conceptions of trust. In one, trust stimulates
participation and investment online by emboldening people and organizations to act,
instead of freezing in the face of risk and uncertainty. In the other, trust is a defining
characteristic of relationships and communities and is a part of what induces participation
and contribution to them. (Neither of the conceptions is necessarily tied to the online
context.) One way to tell these two conceptions apart is to consider how a typical
situation in which each operates, respectively, would evolve in the ideal. In the first, the
ideal of a typical situation, say a commercial transaction, is one in which participants are fully
apprised of all necessary details and are certain that they face no danger of harm. In the
second, the ideal is a firmly entrenched resolve to accord participants the freedom to act,
even to harm. Victory is not certainty of a good outcome; it is unbending commitment to
freedom of action.
Under the first conception, designers and developers of the networked information
infrastructures strive to build in tight restraints, identifiability, complete surveillance, and
accountability -- the analog to the perfect airport security system. The important question
is whether choosing this alternative will bring too harsh a glare to the range of activities
where trust is essential and whether, if it does, the tradeoff is acceptable.
Partial Reference List
Berners-Lee, T. (with Mark Fischetti) (1999) Weaving the Web: the Original Design and
Ultimate Destiny of the World Wide Web by Its Inventor. San Francisco: HarperCollins
Publishers
Camp, J. (2000) Trust and Risk Cambridge, Mass.: The MIT Press.
Cassell, J. and Bickmore, T. (2000) “External Manifestations of Trustworthiness in the
Interface” Communications of the ACM, Vol. 43, No. 12 (December 2000) 50-56
COMMISSION ON INFO. SYS. TRUSTWORTHINESS, NATIONAL RESEARCH COUNCIL, TRUST IN
CYBERSPACE 1 (Fred B. Schneider, ed. 1999)
Corritore, C.L., Kracher, B., Wiedenbeck, S. (2001). Trust in the online environment. In
M.J. Smith, G. Salvendy, D. Harris, and R.J. Koubek (Eds.), Usability Evaluation and
Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality (pp. 1548-1552). Mahwah, NJ: Erlbaum.
Fogg, B.J., Soohoo, C., Danielsen, D., Marable, L., Stanford, J., & Tauber, E. (2002).
How Do People Evaluate a Web Site’s Credibility? Results from a Large Study. Stanford
Persuasive Technology Lab, Stanford University. Available at
http://www.consumerwebwatch.org/news/report3_credibilityresearch/stanfordPTL_abstract.htm or http://credibility.stanford.edu/mostcredible.html.
Friedman, B., Kahn, Jr. P. and Howe, D. (2000) “Trust Online,” Communications of the
ACM, Vol. 43, No. 12 (December 2000) 34-40
Russell Hardin, Trustworthiness, 107 ETHICS 26, 33 (1996)
Hoffman, D., Novak, P. and Peralta, M. “Building Consumer Trust Online,”
Communications of the ACM 42, 4 (Apr. 1999), 80-85
Jones, S., Wilikens, M., Morris, P., Masera, M. (2000) "Trust Requirements in E-Business,"
Communications of the ACM 43, 12 (Dec. 2000), 80-87
Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer
interfaces--trustworthiness of cyber-banking system interfaces. Interacting with
Computers, 10(1), 1-29.
NIKLAS LUHMANN, Trust: A Mechanism for the Reduction of Social Complexity, in TRUST
AND POWER: TWO WORKS BY NIKLAS LUHMANN 8 (photo. reprint 1988) (1979)
Manz, S. (2001). The Measurement of Trust in Web Site Evaluation.
http://www.swt.iao.fhg.de/pdfs/Manz2002_Trust.pdf
Nissenbaum, H. (2001) “Securing Trust Online: Wisdom or Oxymoron?” Boston
University Law Review, Volume 81 No. 3, 635-664
Philip Pettit, The Cunning of Trust, 24 PHIL. & PUB. AFF. 202, 204-05 (1995)
Richard Posner, The Right of Privacy, 12 GA. L. REV. 393, 408 (1978).
Resnick, P., Kuwabara, K., Zeckhauser, R., Friedman, E., “Reputation Systems.”
Communications of the ACM 43, 12 (Dec. 2000), 45-48
Shneiderman, B. "Designing Trust into Online Experiences," Communications of the
ACM, Vol. 43, No. 12 (December 2000) 57-59
ADAM B. SELIGMAN, THE PROBLEM OF TRUST 19 (1997) (arguing that trust in systems
entails confidence in a set of institutions).
Shelat, B. & Egger, F.N. (2002). What makes people trust online gambling sites?
Proceedings of Conference on Human Factors in Computing Systems CHI 2002,
Extended Abstracts, pp. 852-853. New York