THEORY AND REVIEW
INFORMATION PRIVACY RESEARCH:
AN INTERDISCIPLINARY REVIEW
H. Jeff Smith
Department of Decision Sciences and Management Information Systems, Farmer School of Business,
Miami University, Oxford, OH 45056 U.S.A. {jeff.smith@muohio.edu}
Tamara Dinev
Department of Information Technology & Operations Management, College of Business,
Florida Atlantic University, Boca Raton, FL 33431 U.S.A. {tdinev@fau.edu}
Heng Xu
College of Information Sciences and Technology, Pennsylvania State University,
University Park, PA 16802 U.S.A. {hxu@ist.psu.edu}
To date, many important threads of information privacy research have developed, but these threads have not
been woven together into a cohesive fabric. This paper provides an interdisciplinary review of privacy-related
research in order to enable a more cohesive treatment. With a sample of 320 privacy articles and 128 books
and book sections, we classify previous literature in two ways: (1) using an ethics-based nomenclature of
normative, purely descriptive, and empirically descriptive, and (2) based on their level of analysis: individual,
group, organizational, and societal.
Based upon our analyses via these two classification approaches, we identify three major areas in which
previous research contributions reside: the conceptualization of information privacy, the relationship between
information privacy and other constructs, and the contextual nature of these relationships.
As we consider these major areas, we draw three overarching conclusions. First, there are many theoretical
developments in the body of normative and purely descriptive studies that have not been addressed in empirical
research on privacy. Rigorous studies that either trace processes associated with, or test implied assertions
from, these value-laden arguments could add great value. Second, some of the levels of analysis have received
less attention in certain contexts than have others in the research to date. Future empirical studies—both
positivist and interpretive—could profitably be targeted to these under-researched levels of analysis. Third,
positivist empirical studies will add the greatest value if they focus on antecedents to privacy concerns and on
actual outcomes. In that light, we recommend that researchers be alert to an overarching macro model that
we term APCO (Antecedents → Privacy Concerns → Outcomes).
Keywords: Information privacy, multi-theory, regulation, society, interdisciplinary
1. M. Lynne Markus was the accepting senior editor for this paper. Paul Pavlou served as the associate editor. The appendices for this paper are located in the “Online Supplements” section of the MIS Quarterly’s website (http://www.misq.org).
MIS Quarterly Vol. 35 No. 4 pp. 989-1015/December 2011
Introduction
Information privacy is of growing concern to multiple stakeholders including business leaders, privacy activists, scholars,
government regulators, and individual consumers. Public
opinion polls report that privacy is one of the largest concerns
for consumers. For instance, a Consumer Reports poll
revealed that “72 percent are concerned that their online
behaviors were being tracked and profiled by companies”
(Consumers-Union 2008).
To a great degree, these consumer worries are grounded in the
growing “art of the possible” in the technological realm. The
spread of ubiquitous computing and the seemingly unbounded
options for collecting, processing, distributing, and using
personal information trigger consumer worries. In a recent
study analyzing the organizational privacy practices of the top
50 most visited websites, Gomez et al. (2009) found that most
of these websites use personal information for customized
advertising, and a large number of reputable firms like
Google, Yahoo, Microsoft, and Facebook share their collected
customer data with hundreds of their affiliated companies.
Along with this use and sharing of data comes an associated risk: the Ponemon Institute’s 2007 survey of 786 American consumers found that 62 percent of respondents had been notified that their confidential data had been lost or stolen and that 84 percent of these consumers expressed increased concern or anxiety due to the data loss.
Against this backdrop, the number of privacy-related research
contributions has grown significantly in recent decades.
However, Information Systems (IS) researchers who wish to
examine topics related to information privacy may well find
themselves frustrated with the research domain. Numerous
studies—a number of them based on compelling theoretical frameworks and undergirded by sound, rigorous methodology—have been published over the past few decades.
However, the findings and the theories that emerged have
often relied on overlapping constructs nestled within loosely
bounded nomological networks. This has resulted in a suboptimal cumulative contribution to knowledge.
In historical terms, it is now widely recognized that the recent
evolution of the concept of privacy in general—and information privacy in particular—follows the evolution of information technology itself, as can be seen in Table 1.
While it is impossible for any discipline to claim ownership
of the research concept of privacy in general, we believe that
IS scholars’ contributions to information privacy have been,
and will continue to be, very important in shaping modernized
information privacy conceptualization during the “third era.”
In an attempt to assist the IS research community in providing
a more cohesive treatment of information privacy issues, we
first examine the privacy-related research that has been
published to date. We consider the strengths and weaknesses
of this research stream, and we then provide guidance
regarding future directions for research. We rely on the following assumptions:
(1) The ultimate target of our review is information (rather
than physical) privacy.
(2) Our target audience consists of IS scholars interested in
empirical information privacy research.
(3) These scholars will be best enlightened by guidelines that
are grounded in an interdisciplinary review of prior work.
(4) It is not our goal to build and propose a comprehensive,
broad model of information privacy.
While we will provide some guidelines and outline the
difficulties accompanying such an effort, it is well beyond the
scope of our research endeavor to create such a meta-theory.
Literature Review
This literature review deals directly with information privacy
as opposed to physical privacy. The latter concerns physical
access to an individual and/or the individual’s surroundings
and private space; the former concerns access to individually
identifiable personal information. Historically, the concept of
physical privacy was explicated first. Later, as it became
apparent that information about individuals and groups
(especially families and organizational teams) was gathering
saliency, information privacy was subsumed under the larger
umbrella of general privacy.
In the initial period of this transition, physical privacy concepts and definitions were directly and seamlessly applied to
information privacy, without reported contradictions. The
continuity of the transition from physical privacy to information privacy allowed scholars to embrace the earlier adopted
definitions and to carry them through specific contexts and
cases associated with information privacy. More recently,
however, nomological models associated with information
privacy have been targeted directly to that construct. Thus, in
a manner parallel to that of privacy research development, we
will apply early privacy concepts to information privacy, and
we will also analyze information privacy-specific concepts.
Table 1. Evolution of the Information Privacy Concept Following the Evolution of IT (adapted from Westin 2003)

Privacy Baseline (1945-1960): Limited information technology developments, high public trust in the government and business sectors, and general comfort with information collection.

First Era of Contemporary Privacy Development (1961-1979): Rise of information privacy as an explicit social, political, and legal issue. Early recognition of the potential dark sides of the new technologies (Brenton 1964), formulation of the Fair Information Practices (FIP) framework, and establishment of government regulatory mechanisms such as the Privacy Act of 1974.

Second Era of Privacy Development (1980-1989): Rise of computer and network systems, database capabilities, and federal legislation designed to channel the new technologies into FIP, including the Privacy Protection Act of 1984. European nations move to national data protection laws for both the private and public sectors.

Third Era of Privacy Development (1990-present): Rise of the Internet, Web 2.0, and the terrorist attacks of 9/11/2001 dramatically changed the landscape of information exchange. Reported privacy concerns rose to new highs.
We note that the distinction between physical and information
privacy is seldom clarified in public debate or, for that matter,
in many areas of research. For example, comments about
privacy violations in the public media seldom draw a clear
distinction between the constructs of physical and information
privacy. Most of the questionnaires embraced in the cited
literature ask about privacy rather than information privacy,
as do the general surveys of polling agencies. In a related vein, popular terms and names of organizations, such as privacy advocates, Privacy International, and the World Privacy Forum, leave the distinction unclear. The situation is the same with legal language and laws in many countries.
In an attempt to provide as direct a treatment of the concepts
as possible, we will follow the following principle throughout
the remainder of this paper: we will use the term privacy as
a reference to information privacy, which is our immediate
focus. We will occasionally refer specifically to physical
privacy or to general privacy (which includes both physical
and information privacy), and we intend those references to
be distinct from those to (information) privacy.2
2. In many cases, we have had to rely on our own interpretations of other
authors’ covert assumptions about the distinctions as we classify and interpret
their writings. Our heuristic is as follows: if a particular article/book (or
group of articles/books) is clear in its orientation toward either physical or
(information) privacy, we have categorized it as such. If the referent is overtly stated as encompassing both domains, or if there is no explicit statement but the author’s covert assumptions appear (in our opinion) to encompass both, we have categorized it as general privacy.
Although general privacy is broadly multidisciplinary in
nature, each discipline offers a unique angle and perspective
following its own accepted methodologies and discovery
processes. Even so, across numerous disciplines, general
privacy is often subsumed under the rubric of ethics. Indeed,
in most encyclopedias, textbooks, and sociology readings
(e.g., Bynum 2008; Pearlson and Saunders 2009), the topic of
privacy is often found in the “ethical issues” chapter. As we
will see below, general privacy beliefs are integrated in the
moral value system of the society, and it is therefore natural
that general privacy would be seen as an ethics topic. Consistent with this approach, we will classify general privacy
studies using an ethics-based nomenclature of normative,
purely descriptive, and empirically descriptive (Copp 2007;
Singer 1991; Walsham 1996; Werhane 1994).
The word normative refers to guidelines or norms, so normative assertions rest upon ethical commitments and behaviors
to be prized, preferred, and valued; they are often called ought
statements. Examples of normative general ethical theories
are Kantian ethics, virtue ethics, and utilitarian ethics.
Examples of normative business ethics theories are stockholder, stakeholder, and social contract (Pearlson and
Saunders 2009). In an overarching sense, descriptive studies
attempt to explain what is rather than what ought to be.
Descriptive studies can be purely descriptive (simple statements of fact) or empirically descriptive (tests of theories/
frameworks utilizing positivist, scientific methods).3
3. Although a particular study can be classified as primarily normative, purely descriptive, or empirically descriptive, the treatments do sometimes overlap, and in each a part of the other can sometimes be found (Werhane 1994).
Published studies can thus be classified as normative, purely
descriptive, or empirically descriptive. The normative publications provide an ought argument for how the world should
be, according to society’s or the author’s value system. Many
(but not all) normative publications involve some statement of
the author’s opinion regarding an issue; per Herson (1990),
some of these normative opinions may be buttressed by
descriptive or empirical statements, but the overall tone of the
argument is normative in its nature.
The purely descriptive studies are characterized by descriptions of a state of affairs, with no rigorous attempt to infer
causality or to provide a process mapping. Empirically
descriptive studies often attempt to provide tests of relationships between constructs or to map processes. For example,
a study that tested a nomological model that showed how
various psychological constructs such as cynicism and
paranoia were associated with privacy concerns would be
empirically descriptive.4 Similarly, interpretive process
tracing studies would also fall under the rubric of empirically
descriptive studies.
In addition to their categorization as normative, purely
descriptive, or empirically descriptive, general privacy studies
can also be classified based on their level of analysis: individual, group, organizational, and societal (both cross-national and cross-cultural). In reviewing the literature by
level of analysis, we follow the review methodologies
embraced in previous scholarly work (Clark et al. 2007;
Leidner and Kayworth 2006). Almost all empirical research
associated with privacy has attempted to address one or more
of these levels of analysis, but it will be seen that very little
attention has been paid to group-level analysis.5
Appendix A describes the methodology used to identify
general privacy publications. Tables B1, B2, and B3 provide
summaries of the purely descriptive works, normative works, and empirically descriptive studies, respectively. Our analysis reveals that the works addressed to privacy have almost all been associated, in one way or another, with attempts to answer one of three major questions.

4. Philosophers seldom distinguish between our categories of “purely descriptive” and “empirically descriptive” treatments. We note the distinction because it is important in IS and social science research.

5. Although references to “groups” can be found in some IS analyses (e.g., Leidner and Kayworth 2006), a clear definition of a group (as opposed to an organization or a subunit) is seldom included. We embrace the view suggested by Propp and Kreps (1994) that members of a group (1) pursue interdependent goals, (2) have the ability to be aware of and react to each other, and (3) perceive themselves collectively to be a group. The specific size of such a group is subject to debate, although “traditionally, the majority of research on groups has focused on small groups of three to seven members” (Propp and Kreps 1994, p. 10). As will be discussed later, a clarification of definitional boundaries regarding groups should be a component of future research into online groups.
(1) What is (and is not) privacy? There is no single concept
of privacy that crosses all disciplines and that is
embraced by all observers. There are ongoing debates
regarding the distinction, if any, between privacy and
related constructs such as security.
(2) What is the relationship between privacy and other
constructs? There are competing research assumptions
and paradigms associated with empirical research into
various nomological models that include privacy.
(3) To what extent does context matter in the relationships
between privacy and other constructs? There is
disagreement regarding the extent to which these relationships can be generalized across contexts, such as
types of information, different industries, and new technological applications.
In the following sections, we discuss each of these major
questions and provide a summary of the attempts to grapple
with them. At the end of each section, we reflect on the work
associated with that question. We summarize our reflections
in our “Future Research and Conclusion” section.
Question #1: What Is
(and Is Not) Privacy?
General privacy as a philosophical, psychological, sociological, and legal concept has been researched for more than
100 years in almost all spheres of the social sciences. And
yet, it is widely recognized that, as a concept, privacy “is in
disarray [and n]obody can articulate what it means” (Solove
2006, p. 477). Numerous attempts have been made by social
and legal scholars to bring together the different perspectives
found in different fields. However, the picture that emerges
is fragmented with concepts, definitions, and relationships
that are inconsistent and neither fully developed nor empirically validated.
Figure 1 and Table B4 summarize the approaches to defining
general privacy that can be found in various disciplines (corresponding to numbers of articles found in each discipline).

Figure 1. Approaches to Defining General Privacy (Refer to Table B4 for more details)

The definitional approaches can be broadly classified as either value-based or cognate-based. The value-based definition views general privacy as a human right integral to society’s moral value system. Historically, this was the first definition
of general privacy. Subsequent works defined the boundaries
between public and private. The evolution of IT complicated
the debate about the boundaries—a process seen by some
scholars as pervasive dissolution of the boundary (Marx 2001;
Rosen 2000) and by others as lack of rigor in defining these
boundaries in the first place (Nissenbaum 1998).
Furthermore, when the “general privacy as a right” concept was applied to consumer behavior, a privacy paradox was noted:
despite reported high privacy concerns, consumers still
readily submit their personal information in a number of
circumstances. Thus, the notion of privacy as a commodity
was conceptualized (Bennett 1995). Under the commodity
view, privacy is still an individual and societal value, but it is
not absolute, as it can be assigned an economic value and be
considered in a cost–benefit calculation at both individual and
societal levels.
In his attempt to supply a more rigorous definition that can be
used in empirical research, Westin (1967) introduced the
notion of state in the general privacy concept: “voluntary and
temporary withdrawal of a person from the general society”
(p. 7). Psychologists and cognitive scientists then became
interested in producing a cognate-based conceptualization of
general privacy, related to the individual’s mind, perceptions, and cognition rather than to an absolute moral value or norm.
Since the state of withdrawal rests within the physical or
information space, scholars argued that general privacy was
about control of physical space and information. We now
discuss the major definitional streams from each category in
more detail.
Value-Based Definitions
General Privacy as a Right
There is a substantive debate regarding general privacy’s
status as a human right: If such a right exists, how did it
emerge? How is it justified philosophically (Schoeman
1984)? Who is responsible for protecting it (Milberg et al. 2000)? This view of general privacy is fundamentally
normative, and some scholars (e.g., Posner 1984) claim that
it may be at odds with the legal and societal frameworks of
various cultures and thus cannot be treated absolutely.
The evolution of this debate is obvious when one considers
the roots of general privacy as a right in legal and political
theories. For example, in the United States, common law did
not recognize any right to general privacy, general privacy is
not spelled out in the U.S. Constitution, and the courts did not
consider general privacy as a protected right until the 20th
century. Although divergent from the traditional British
perspective (Richards and Solove 2007), general privacy has
generally been viewed as a developing right in U.S. law, with
the U.S. derivation usually traced to Warren and Brandeis’s
(1890) article in Harvard Law Review, in which they defined
general privacy as “the right to be let alone.” This “general
privacy as a right” perspective has since influenced numerous
opinions and has been given constitutional sanction by the
U.S. Supreme Court (Breckenridge 1970). Common themes
of relevant court cases include general privacy and law
enforcement (searches and seizures); general privacy and self
(abortions and embryos); privacy and the press (private facts
exposure, celebrity culture, intrusion); privacy and the voyeur
(sex tapes, peephole); and privacy in the workplace (psychological testing, lifestyle monitoring) (Alderman and Kennedy
1997).
Two major issues arose with these court cases: (1) the need to define general privacy more specifically than the “right to be let alone” and (2) the state as the protector of general
privacy. These two issues made the general privacy debate
among legal and political scholars necessarily and unavoidably ideological. Regarding the first issue, courts have
stopped seeking a definition of general privacy following the
Younger Committee Report (1972), which concluded that
general privacy could not be satisfactorily defined.
Regarding the second issue, two major camps of scholars
argue about the role of the state in protecting individual
general privacy and thus argue for or against the need for
regulation of general privacy. The rallying point in the “for”
argument among political theorists is the role of the state as
the guarantor of individual general privacy (Rosen 2000).
Ironically, the same principles of liberalism that undergird this
argument also serve as the rallying point of the “against”
libertarian camp of privacy protection. The “against” camp points to the market-based economic perspective of privacy.
According to this view (to which we now turn), privacy is
inherently an economic commodity and should be treated as
such.
Privacy as a Commodity
Some libertarian political scientists argue that a call for
greater privacy is, fundamentally, antagonistic to the political
economy of the information markets (Bennett 1995; Cohen
2001). In their view, privacy is not an absolute right but is
subject to the economic principles of cost–benefit analysis
and trade-off. From this observation, a stream of treatment
from the privacy as commodity perspective has arisen
(Campbell and Carlson 2002; Davies 1997).
To explain the phenomenon of voluntarily providing information online (so-called self-surveillance), social scientists recognize the economic component of privacy: individuals cooperate in the online gathering of data about themselves as economic subjects. This participation in surveillance is possible because of a recent reconceptualization of privacy in the consumer’s mind, from a right or civil liberty to a commodity that can be exchanged for perceived benefits (Campbell and Carlson 2002; Davies 1997; Garfinkel 2000).
Since libertarians have always treated privacy as a commodity, the salient question revolves around the extent to which the observed commodification of privacy reflects a measurable shift in individuals’ attitudes rather than a shift in scholarly paradigm. For example, Laudon (1996) has argued
that the current crisis in the privacy of personal information
is a result of market failure and calls for market correction
through information technologies with privacy-enhancing
mechanisms.
The distinction between privacy as a right and a commodity,
although obviously undergirded by normative, value-laden
assumptions, becomes important in empirical research. As
will be seen in a later section in which we discuss the
relationship between privacy and other constructs, many
researchers unconsciously embrace one definition or the other
without noting the value-laden assumptions.
Cognate-Based Definitions
General Privacy as a State
The general privacy as a state concept was introduced by
Westin (1967), who defined privacy through four distinct
substates: anonymity, solitude, reserve, and intimacy. Later,
Schoeman (1984, p. 3) defined general privacy as “a state of
limited access to a person.” Weinstein (1971, p. 626) defined
general privacy as a state of “being apart from others.” He
drew a parallel between general privacy and alienation, loneliness, ostracism, and isolation and noted that, among those,
only general privacy is sought after whereas the others are
avoided by individuals and are regarded by society as
punitive. Further, Laufer and Wolfe (1977) conceptualized
general privacy as a situational concept (state) tied to concrete
situations with three dimensions: self-ego, environmental, and
interpersonal. Information systems, economics, and marketing scholars narrowed these definitions of general privacy
so that they addressed information-based issues (see Table
B4). The state of limited access was translated to state of
limited access to information.
When privacy is viewed as a state, it is natural for researchers
to consider it in terms of its role as a sought-after goal (i.e., an
individual’s desire to exist in a state of privacy). The implication is that there must be a continuum of states of privacy,
from absolute to minimal.
General Privacy as Control
The concept of general privacy as control originated in
Westin’s (1967) and Altman’s (1975) theories of general
privacy. Altman’s definition of general privacy is “the selective control of access to the self” (p. 24). Margulis (1977a,
1977b) unified and elaborated on Westin’s and Altman’s
perspectives and proposed a control-centered general privacy
definition: “Privacy, as a whole or in part, represents the
control of transactions between person(s) and other(s), the
ultimate aim of which is to enhance autonomy and/or to
minimize vulnerability” (1977a, p. 10). The control-based
definition has since entered the mainstream of privacy
research—likely because it lends itself more readily to the
attributes of information privacy—and has been further
developed in the fields of information systems and marketing
(Altman 1975; Culnan 1993; Kelvin 1973; Margulis 1977a;
Smith et al. 1996; Westin 1967). It is worth noting that the original definition equates privacy with control per se, while most evolving definitions equate privacy with the ability to control.
The element of control is embedded in most conceptual arguments and definitions of privacy and
has been used to operationalize privacy in numerous measurement instruments (Altman 1975; Culnan 1993; Kelvin
1973; Margulis 1977a; Smith et al. 1996; Westin 1967).
Scholars have linked the concept of general privacy with control by either defining general privacy as control, per se, or by
positioning control as a key factor shaping privacy. Johnson
(1974), for instance, defined general privacy as “secondary
control in the service of need-satisfying outcome effectance”
(p. 91).
However, many researchers from several disciplines,
including information systems, reason that control is actually
one of the factors that shape general privacy and that general
privacy is not control per se (Laufer and Wolfe 1977;
Margulis 2003a, 2003b). For instance, Laufer and Wolfe
(1977) conceptualized control as a mediating variable in the
general privacy system by arguing that “a situation is not
necessarily a [general] privacy situation simply because the
individual perceives, experiences, or exercises control” (p.
26). Conversely, the individual may not perceive (s)he has
control, yet the environmental and interpersonal elements may
create perceptions of general privacy (Laufer and Wolfe
1977). As Margulis (2003a, 2003b) pointed out, there have
been very few theoretical attempts to clarify the nature of
control and to explicate this control–privacy contention in the
privacy literature (exceptions are Dinev and Hart 2004;
Johnson 1974; Xu 2007).6
What Privacy Is Not
Privacy has been described as multidimensional, elastic, and
dynamic in the sense that it varies with life experience
(Altman 1977; Laufer and Wolfe 1977). Overlapping
concepts such as confidentiality, secrecy, anonymity, security,
and ethics have added to the confusion (Margulis 2003a,
2003b). Therefore, Question #1 also asks what privacy is not.
Although there is confusion regarding the boundaries around
the construct, much of the murkiness can be stripped away by
a careful consideration of the distinctions.
6. Privacy is frequently defined in IS and many branches of social science
research in phrases such as “the ability of individuals to control the terms
under which their personal information is acquired and used” (Culnan and
Bies 2003, p. 326), which is adapted from Westin’s (1967) definition. Such
a definition is nonnormative, because it assumes no right to privacy, and it
also implies no specific tradeoff with other commodities. This definition
does not rely on any assumptions about a state of being, which reduces the
ambiguity inherent in empirical assessment thereof. This definition may be
less useful for philosophical discourse, although such discourse is seldom
associated with empirical research.
Anonymity: Anonymity is the ability to conceal a person’s
identity (Camp 1999; Marx 1999; Qian and Scott 2007;
Rensel et al. 2006; Zwick and Dholakia 2004), which is central to information collected for statistical purposes.
In the IT context, anonymity is often shaped by the features
of privacy-enhancing technologies. For example, anonymizers allow an individual to browse Web sites with a high
degree of anonymity, as cookies cannot be placed on the
user’s browser and the IP addresses cannot be tracked (Waldo
et al. 2007). Anonymity is not dichotomous, in that it varies
in degrees (Kobsa and Schreck 2003; Nissenbaum 1999; Qian
and Scott 2007): individuals can choose to be totally anonymous, pseudonymous, or identifiable.
There has been much discussion as to what role anonymity
plays in privacy. Although these two concepts interrelate,
anonymity is not privacy (Camp 1999). Anonymity exists
when someone is acting in a way that limits the availability of
identifiers to others. Since the information cannot be correlated back to the individual, this may enable privacy control.
However, many other avenues to such control also exist.
Secrecy: Secrecy has been defined as intentional concealment
of information (Bok 1989; Tefft 1980), and it usually
expresses a disposition toward the sharing of potentially
inaccurate information (Zwick and Dholakia 2004). Secretive
withholding of personal information is then regarded as an
attempt to block any digital representation from emerging in
the network. Although secrecy is easily distinguishable from privacy (Hirshleifer 1980), the two are often mistaken for and confused with each other (McLean 1995). “Privacy need not
hide; and secrecy hides far more than what is private” (Bok
1989, p. 11). Warren and Laslett (1977) also reflect on the
conceptual comparison between privacy and secrecy. According to them, secrecy implies the concealment of something that is negatively valued by the excluded audience;
privacy, by contrast, protects behavior which is either morally
neutral or valued by society. Secrecy enables individuals to
manipulate and control environments by denying outsiders
vital information about themselves (Tefft 1980).
Confidentiality: Richards and Solove (2007) suggest that
while the American derivation of general privacy is grounded
in one’s inviolate personality (focus on individualism), British
law instead embraces a conception of privacy as confidentiality. With the advent of extensive information exchange,
confidentiality concerns the externalization of restricted but
accurate information to a specific entity (Zwick and Dholakia
2004). The distinction between privacy and confidentiality is
well discussed in the literature (Camp 1999; Rindfleisch
1997). Privacy corresponds to the desire of a person to control
the disclosure of personal information; confidentiality corresponds to the controlled release of personal information to
an information custodian under an agreement that limits the
extent and conditions under which that information may be
used or released further.
Building on Zwick and Dholakia’s (2004) conceptualization
of identity management, we can differentiate anonymity,
secrecy, and confidentiality based on the externalization of
personal information. Figure 2 illustrates how the various
constructs are related based on the digital representation of an
individual, which is determined by the amount and accuracy
of the personal information collected. Of course, none of the
constructs in Figure 2 is equivalent to privacy itself.
Security: Perceptions of, and concerns about, security among users of
electronic systems have been addressed in IS research (Benson
1983; Goodhue and Straub 1991; White and Christy 1987)
but, as has been pointed out by Belanger et al. (2002), there
is a lack of understanding of how privacy and security issues
are related. Security corresponds to the concerns about the
protection of personal information with three specific goals
(Belanger et al. 2002; Camp 1999; Chellappa 2008): integrity,
which assures that information is not altered during transit and
storage; authentication, which addresses the verification of a
user’s identity and eligibility for data access; and confidentiality, which requires that data use be confined to authorized purposes by authorized people. As Culnan and Williams (2009)
argue, organizations can successfully secure the stored
personal information but still make bad decisions about the
subsequent use of personal information, resulting in information privacy problems. Therefore, as Ackerman (2004)
suggested, “security is necessary for privacy, but security is
not sufficient to safeguard against subsequent use, to minimize the risk of…disclosure, or to reassure users” (p. 432).
Ethics: As mentioned above, general privacy has consistently been viewed as an ethical issue across various disciplines. While there are ethical dimensions associated with
general privacy (Ashworth and Free 2006; Caudill and
Murphy 2000; Culnan and Williams 2009; Foxman and
Kilcoyne 1993), general privacy is not equivalent to ethics.
General privacy has been examined in the literature from a
number of ethical theoretical perspectives including social
contract theory, duty-based theory, stakeholder theory, virtue
ethics theory, and the power–responsibility equilibrium model
(for a review, see Caudill and Murphy 2000). Even though
philosophical argumentation may imply some normative
ethical obligations to protect or to acknowledge privacy, it is
incorrect to equate privacy with ethics. In fact, one can easily
engage in empirical privacy-related research without ever
considering ethical aspects of the construct.
In that light, having now considered what privacy is (and is
not), we turn to a consideration of its empirical treatment
across various disciplines.
Figure 2. Tactics of Identity Management (Adapted from Zwick and Dholakia 2004). [A 2 × 2 grid crossing the amount of personal information externalized (low/high) with the accuracy of that information (low/high): (1) transparency (high amount, high accuracy): disclose a large amount of personal information and reveal an accurate representation of the self; (2) secrecy (low amount, low accuracy): share little and potentially inaccurate information, avoiding digital representations of the real self; (3) confidentiality (low amount, high accuracy): externalization of restricted but highly accurate information; (4) anonymity (high amount, low accuracy): sharing of personal information while concealing a consumer’s real identity.]
Question #2: What Is the Relationship
Between Privacy and Other Constructs?
The relationship between privacy and other constructs has
been examined by a number of empirically descriptive (and
primarily positivist) studies that cut across several disciplines
such as marketing, IS, and organizational behavior (see
Figure 3).
We first examine the central construct in this figure (“Privacy
Concerns”); we then consider the antecedents (on the left),
and we then focus on the outcomes and associated
relationships (on the right). We refer to this macro model as
“Antecedents → Privacy Concerns → Outcomes” (APCO).
Privacy Concerns as a Measurable
Proxy for Privacy
Because of the near impossibility of measuring privacy itself,
and also because the salient relationships depend more on
cognitions and perceptions than on rational assessments,
almost all empirical privacy research in the social sciences
relies on measurement of a privacy-related proxy of some
sort. Although the proxies sometimes travel with monikers
such as beliefs, attitudes, and perceptions, over time, especially within IS research, there has been a movement toward
the measurement of privacy concerns as the central construct.
Several studies have operationalized privacy concerns in
detail: The concern for information privacy (CFIP) scale was
developed by Smith et al. (1996), who identified four data-related dimensions of privacy concerns (collection, errors,
secondary use, and unauthorized access to information).
These dimensions were later revalidated by Stewart and
Segars (2002). These dimensions have since served as some
of the most reliable scales for measuring individuals’ concerns toward organizational privacy practices. More recently,
Malhotra et al. (2004) operationalized a multidimensional
scale of Internet users’ information privacy concerns (IUIPC),
which adapted the CFIP into the Internet context.
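As a concrete illustration of how a multidimensional concern scale of this kind is typically scored, the sketch below averages Likert responses within each dimension and then across dimensions. The item responses and the equal-weight aggregation are illustrative assumptions; this is not the validated CFIP or IUIPC instrument itself:

```python
from statistics import mean

# Hypothetical 7-point Likert responses, grouped by the four
# data-related dimensions identified by Smith et al. (1996).
responses = {
    "collection":          [6, 7, 5],
    "errors":              [4, 5, 4],
    "secondary_use":       [7, 6, 7],
    "unauthorized_access": [6, 6, 5],
}

# Score each dimension as the mean of its items...
dimension_scores = {dim: mean(items) for dim, items in responses.items()}

# ...and summarize overall concern as the mean of the dimension scores
# (a common, though not the only, aggregation choice).
overall_concern = mean(dimension_scores.values())
print(dimension_scores)
print(round(overall_concern, 2))
```

In practice, researchers validate such scales with factor-analytic techniques before aggregating, as Stewart and Segars (2002) did for CFIP.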
It should be noted that these privacy concerns are almost
always measured at an individual level of analysis, so most of
the research on the left side of Figure 3 has an individual unit
as its dependent variable (DV). However, not all of the independent variables (IVs) are measured at the individual level, as some (e.g., culture/climate) are at the organizational or national level.

Figure 3. Relationships Between Privacy and Other Constructs: Antecedents → Privacy Concerns → Outcomes (APCO Macro Model). [Antecedents (left side): privacy experiences, privacy awareness, personality differences, demographic differences, and culture/climate. Central construct: privacy concerns (e.g., beliefs, attitudes, perceptions). Outcomes (right side): behavioral reactions (including disclosures), trust, privacy notice/seal, regulation, and the privacy calculus (risks/costs weighed against benefits). Dotted lines indicate that a relationship is tenuous (i.e., has not been confirmed through repeated studies). Not shown: a possible two-way loop, in which some actions on the right may impact some constructs on the left. *Results threatened by the privacy paradox, since usually intentions (not behaviors) have been measured.]
Privacy Concerns as Dependent Variable (DV)
Although relationships between a number of antecedents and
privacy concerns have been investigated, these studies have
usually been conducted in a somewhat disjointed manner and
with only minimal replication. A smaller body of research
has systematically emerged on the left-hand side of Figure 3.
Drawing on information boundary theory, Xu et al. (2008)
developed an integrative model suggesting that privacy
concerns form because of an individual’s personal characteristics or situational cues that enable one person to assess
the consequences of information disclosure.
With respect to privacy experiences, Smith et al. (1996) found
that individuals who have been exposed to, or been the victim
of, personal information abuses tend to have stronger concerns
regarding information privacy. Privacy awareness reflects
the extent to which an individual is informed about
organizational privacy practices (Malhotra et al. 2004; Phelps
et al. 2000). Research suggests that consumers’ privacy
concerns are triggered when consumers become aware that
organizations have collected and/or used their personal
information without their permission (Cespedes and Smith
1993). Consumers who are unaware of name removal
procedures tend to be less likely to be concerned about
privacy than consumers who are aware of name removal
procedures (Culnan 1995). It has been found that consumers
tend to be less concerned about their privacy when firms seek
permission to collect and use their information (Nowak and
Phelps 1995).
Personality differences such as introversion versus
extroversion (Lu et al. 2004), independent-self versus
interdependent-self (Xu 2007), and “big-five” personality
traits7 (Bansal et al. 2010) have been found to affect
individual privacy concerns. Dinev and Hart (2006) found
that social awareness is a predictor of privacy concerns.
Individuals with high social awareness are aware of privacy
policies and follow privacy issue developments. Various
studies have also investigated how demographic differences
affect the degree of stated privacy concerns (Chen and Rea
2004; Culnan and Armstrong 1999; Sheehan 1999; Sheehan
and Hoy 2000). For example, women have been found to be
generally more concerned than men about the impact of
information collection on their privacy (Sheehan 1999). Also,
it was found that those consumers who were less likely to be
concerned about privacy were more likely to be young, poor,
less educated, and African-American (Culnan 1995).

7Bansal et al. (2010) examined the role of the “big five” personality traits in influencing individuals’ perceptions of health information sensitivity. These five dimensions of personality are extroversion, agreeableness, emotional instability, conscientiousness, and intellect.

Figure 4. Studies Related to Cross-Cultural Contexts (Refer to Table B5 for more details)
Cross-cultural antecedents have also been considered as
independent variables (see Figure 4 and Table B5), most often
at the individual and the societal levels. For example, Dinev
et al. (2006a, 2006b) showed that Italian society has a different
concept of privacy that leads to lower privacy concerns but
also to higher perceived risk. These differences give rise to
weaker relationships between institutional trust and e-commerce use, privacy concerns and e-commerce use, and
perceived risk and institutional trust, and a stronger
relationship between perceived risk and privacy concerns.
Privacy Concerns as Independent Variable (IV)
In contrast to the limited empirical attention that has been
focused on relationships between antecedents and privacy
concerns, a larger body of research has emerged on the righthand side of APCO model in Figure 3: that is, consideration
of privacy concerns as an IV in which other outcomes are
viewed as the DVs.
Behavioral reactions. The most prominent DVs are those
associated with behavioral reactions to privacy concerns, with
the most visible reactions being individuals’ willingness to
disclose information and/or to engage in commerce. Some
researchers have viewed trust as a mediating variable between
privacy concerns and disclosure itself (Metzger 2004; Xu et
al. 2005), but in many other studies trust has been shown to be
an antecedent of privacy (e.g., Belanger et al. 2002), an outcome of
privacy (e.g., Bansal et al. 2010; Chellappa 2008; Malhotra et
al. 2004), or a moderator/mitigator (e.g., Bansal et al. 2008)
of the influence of privacy concerns on individual behavioral
reactions. In one study of 477 U.S. households, researchers
found that privacy concerns had a significant impact on online
purchase intent, with the greatest negative impact being
through its relationship with trust (Eastlick et al. 2006). Firms
that are positioned as “safer” or “trustworthy” on the privacy
dimension will likely have a competitive advantage (Bowie
and Jamal 2006). It has been found that consumers who trust
the firm are less concerned about their privacy and more
willing to provide personal information (Schoenbachler and
Gordon 2002). In a study examining the efficacy of
improving trust and reducing concern on managing consumer
information, Milne and Boza (1999) showed that building
trust is more effective than trying to reduce consumer
concern.
Research has found that firms can build trust, and thus
mitigate privacy fears, by exhibiting procedural justice
through the implementation of fair information practices
(Culnan and Armstrong 1999; Xu et al. 2010), explicit
communication of a privacy policy (Andrade et al. 2002;
Eastlick et al. 2006; Milne and Boza 1999), adoption of P3P
in their privacy policy development (Xu et al. 2005), and/or
display of privacy notices and/or seals of approval (LaRose
and Rifon 2006; Wang et al. 2004). The presence of privacy
seals has been found to have a positive effect on the
perception of trust in a website (Rifon et al. 2005) and to
result in more favorable perceptions of the privacy
statement (Miyazaki and Krishnamurthy 2002).8

8However, a number of recent studies uncovered weak consumer trust toward
third-party certification agencies. Several studies came to the conclusion that
websites that decide to “pay up” for certain privacy seals seem to have more
questionable privacy practices than ones that do not (Edelman 2011). LaRose
and Rifon (2006) found that sealed sites requested significantly more
personal information from users than unsealed sites. Miyazaki and Krishnamurthy (2002) reviewed 60 high-traffic websites and found no support for the
hypothesis that participation in a seal program is an indicator of better
privacy practices (LaRose and Rifon made similar findings).

Although Figure 3 includes the construct “behavioral
reactions” on its right-hand side, in reality it is quite common
for researchers to measure stated intentions instead of actual
behaviors, inferring through references to the theory of
reasoned action (TRA) (Fishbein and Ajzen 1975) that
behaviors will match intentions. To the extent that the
privacy paradox holds, however, this appeal to TRA may be
misguided. In fact, associations between privacy concerns
and stated intentions may not be reflective of actual
behaviors. We have noted this exposure in Figure 3 with
asterisks.

Privacy paradox. Recent surveys, anecdotal evidence, and
experiments have highlighted the privacy paradox:
individuals state privacy concerns but behave in ways that
seemingly contradict their statements (Acquisti 2004;
Acquisti and Grossklags 2005a; Jensen et al. 2005; Norberg
et al. 2007). For example, Sheehan and Hoy (1999) found
that stated privacy concern correlates negatively with the
reported frequency of registering with websites in the past and
positively with providing incomplete information during
registration. Norberg et al. (2007) found that, for all information categories (personally identifying, financial, preferences,
demographic, etc.), the level of actual disclosure significantly
exceeded individuals’ intentions to disclose information.
A plausible explanation provided by Acquisti’s work for such
a privacy attitude/behavior dichotomy is that users’ privacy
decision processes are affected by bounded rationality
(Acquisti 2004; Acquisti and Grossklags 2005b). The economics literature suggests that individuals have a tendency to
discount “hyperbolically” future costs or benefits
(O’Donoghue and Rabin 2001; Rabin and O’Donoghue
2000). Such hyperbolic discounting implies inconsistency of
personal preference over time: future events may be
discounted at different discount rates than near-term events
(Acquisti 2004); the benefits of disclosing personal information may be immediate (e.g., convenience of placing orders
online), but the risk of such information disclosure may be
invisible or spread over time (e.g., identity theft).
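The time inconsistency that hyperbolic discounting produces can be shown with a small numerical sketch. It uses the common one-parameter hyperbolic form V = A / (1 + kD); the amounts and the value of k below are purely illustrative assumptions, not figures drawn from the studies cited:

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """One-parameter hyperbolic discounting: V = A / (1 + k * D).

    `k` controls how steeply future outcomes are devalued;
    the default of 0.05 per day is an illustrative assumption.
    """
    return amount / (1 + k * delay_days)

# Immediate benefit of disclosure: convenience worth "10" today.
benefit_now = hyperbolic_value(10, delay_days=0)

# Potential cost of disclosure (e.g., identity theft) worth "-100",
# but perceived as a year away.
cost_later = hyperbolic_value(-100, delay_days=365)

net_today = benefit_now + cost_later
print(net_today)  # positive: disclose now, despite the larger nominal cost

# Re-evaluated once the cost is imminent, the same tradeoff reverses.
net_at_the_time = hyperbolic_value(10, 0) + hyperbolic_value(-100, 0)
print(net_at_the_time)  # negative: the earlier choice now looks mistaken
```

The reversal between the two printed values is the preference inconsistency that Acquisti (2004) invokes: the distant risk is discounted so steeply that disclosure looks attractive today, even though the same person would reject the bargain if the loss were immediate.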
Regulation. In addition to behavioral reactions associated
with disclosure and commerce, Milberg et al. (2000) suggest
that if consumers do not perceive firms as adequately
protecting their privacy, they will distrust self-regulation and
prefer state intervention, which can eventually lead to a
regulatory response. A limited stream of research (see,
especially, Jentzsch 2001; Smith 2001) has investigated the
dissonance between U.S. and European privacy laws, which
is related to the conflict between viewing privacy as a right
versus a commodity. Jentzsch (2001) has noted that Europe
assigns more property rights to consumers than does the
United States, and financial privacy is more strictly regulated
in Europe than in the U.S., although the U.S. seems to be
converging more on the European model.
At the societal level, several studies have pointed out that human
rights societies have long approached privacy in an omnibus
fashion by passing sweeping privacy bills that address all
instances of data collection, use, and sharing (Bennett and
Raab 1997; Dholakia and Zwick 2001; Smith 2001). Some
examples of countries in this category include Australia,
Canada, New Zealand, and countries in the European Union
(Smith 2004). Assigning fundamental rights to personal
information would result largely in an opt-in market for
information sharing, whereby firms would have access to the
information only of those consumers who chose to make it
available (Smith 2001). In contrast, in commodity societies,
there are no omnibus laws governing collection, use, and
sharing of personal information that transcend all types of
data in all sectors of the economy (Smith 2001). Some
countries in this category have a patchwork of sector-specific
privacy laws that apply to certain forms of data or specific
industry sectors (Bennett and Raab 1997; Dholakia and Zwick
2001; Smith 2001). For instance, in the United States, there
are sector-specific laws for specific types of records such as
credit reports and video rental records and for classes of
sensitive information such as health information (Smith
2004). The commodity societies largely see opt-in as an
undue burden.
The debate between the right and commodity views of privacy
highlights another controversial issue in the privacy literature:
the relative effectiveness of industry self-regulation versus
government legislation in ensuring consumer privacy. Skepticism about the effectiveness of industry self-regulation in
protecting consumer privacy (Edelman 2011; Hui et al. 2007)
has resulted in privacy advocates and consumers clamoring
for strong and effective legislation to curtail rampant abuses
of information by firms. At the societal level, Tang et al.
(2008) indicate that although overarching government
regulations can enhance consumer trust, regulation may not be
socially optimal in all environments because of lower profit
margins for firms and higher prices for consumers. Bellman
et al. (2004) found that participants from countries with
omnibus privacy regulation had greater desire for more
regulation than participants from countries with sectoral
regulation, while participants from countries with sectoral
privacy regulation had less desire for more regulation than
participants from countries with no privacy regulation.
Privacy Calculus
A subset of empirical studies—often somewhat disconnected
from the other constructs within Figure 3—addresses the
concept of privacy calculus by assuming that a
consequentialist tradeoff of costs and benefits is salient in
determining an individual’s behavioral reactions. This
perspective is found in various works (e.g., Klopfer and
Rubenstein 1977; Laufer and Wolfe 1977; Posner 1981; Stone
and Stone 1990) that view the concept of privacy as not
absolute but, rather, subject to interpretation in “economic
terms” (Klopfer and Rubenstein 1977, p. 64). Such a calculus
perspective of privacy suggests that, when requested to pro-
vide personal information to corporations, consumers would
perform a risk–benefit analysis to assess the outcomes they
would face in return for the information, and respond
accordingly (Chellappa and Sin 2005; Culnan 1993; Dinev
and Hart 2006; Hann et al. 2008; Hui et al. 2006; Milne and
Gordon 1993; Milne and Rohm 2000; Xu et al. 2010). We
consider these risks and benefits.
Privacy risk has been defined as the degree to which an
individual believes that a high potential for loss is associated
with the release of personal information to a firm (Featherman
and Pavlou 2003; Malhotra et al. 2004). In the privacy literature, although privacy concerns have often been treated as a
multidimensional construct (Malhotra et al. 2004; Smith et al.
1996), privacy risk has been treated as a single-dimensional
construct that measures potential loss of control over personal
information (e.g., Dinev and Hart 2006). Prior privacy literature has identified sources of organizational opportunistic
behavior, including insider disclosure or unauthorized access
and theft (Rindfleisch 1997), and selling personal data to, or
sharing information with, third parties, financial institutions
(Budnitz 1998), or government agencies (Preston 2004; Wald
2004). An individual’s calculation of risk involves an assessment of the likelihood of negative consequences as well as the
perceived severity of those consequences (Peter and Tarpey
1975). The negative perceptions related to risk may affect an
individual emotionally, materially, and physically (Moon
2000). A number of e-commerce studies empirically verified
the negative effect of perceived risk on intentions to conduct
transactions (Budnitz 1998; Jarvenpaa and Leidner 1999;
Jarvenpaa and Tiller 1999; Jarvenpaa et al. 2000; Norberg and
Horne 2007; Norberg et al. 2007; Pavlou 2003; Pavlou and
Gefen 2004). Additionally, previous studies generally supported the positive impacts of privacy risk on privacy
concerns (Dinev et al. 2006b; Dinev and Hart 2004), and the
negative impacts of privacy risk on intention to disclose
personal information (Malhotra et al. 2004; Xu et al. 2005).
Privacy benefit. Following the notion of privacy calculus,
“individuals are assumed to behave in ways that they believe
will result in the most favorable net level of outcomes” (Stone
and Stone 1990, p. 363). Scholars have identified three major
components of the benefits of information disclosure:
financial rewards, personalization, and social adjustment
benefits. Recent privacy studies provide empirical evidence
that compensating consumers through financial rewards can
foster their information disclosure (Caudill and Murphy 2000;
Hann et al. 2008; Phelps et al. 2000; Xu et al. 2010). In terms
of the value of personalization, Chellappa and Sin (2005)
found that it can override privacy concerns: “the consumers’
value for personalization is almost two times…more influential than the consumers’ concerns for privacy in determining usage of personalization services” (p. 197). A study
by White (2004) also confirmed that users are more likely to
provide personal information when they receive personalization benefits. A study by Lu et al. (2004) demonstrated that
social adjustment benefits (defined as the establishment of
social identity by integrating into desired social groups) can
also have an effect on intended disclosure behavior.
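The risk-benefit tradeoff underlying the privacy calculus can be sketched as a simple expected-value computation. Everything in this toy model is an illustrative assumption rather than an operationalization used by the studies cited: risk is treated as likelihood times perceived severity (following the Peter and Tarpey logic noted above), benefits are the three components just discussed, and disclosure occurs when perceived benefits exceed expected risk:

```python
from dataclasses import dataclass

@dataclass
class DisclosureScenario:
    """Toy privacy-calculus model; all values are illustrative."""
    financial_reward: float       # e.g., a discount for registering
    personalization_value: float  # value of tailored services
    social_benefit: float         # social adjustment benefit
    loss_likelihood: float        # perceived probability of misuse, 0..1
    loss_severity: float          # perceived magnitude of harm

    def expected_benefit(self) -> float:
        return (self.financial_reward + self.personalization_value
                + self.social_benefit)

    def expected_risk(self) -> float:
        # Risk as likelihood x severity of negative consequences
        return self.loss_likelihood * self.loss_severity

    def would_disclose(self) -> bool:
        # Disclose when the net level of outcomes is favorable
        return self.expected_benefit() > self.expected_risk()

low_risk = DisclosureScenario(5, 8, 2, loss_likelihood=0.05, loss_severity=50)
high_risk = DisclosureScenario(5, 8, 2, loss_likelihood=0.40, loss_severity=50)
print(low_risk.would_disclose())   # True: benefits (15) exceed expected risk (2.5)
print(high_risk.would_disclose())  # False: expected risk (20) exceeds benefits (15)
```

The two scenarios differ only in perceived likelihood of loss, which is enough to flip the disclosure decision, mirroring the empirical finding that perceived risk depresses disclosure intentions while rewards and personalization foster them.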
Looking Across the Empirical Research
In looking across the empirically descriptive (and primarily
positivist) privacy research to date, it can be observed that the
unit of analysis varies little within these studies; in particular,
most studies from the privacy as a commodity perspective
(including privacy calculus and privacy paradox) have considered only the individual unit of analysis (with Dinev et al.
(2006b) being an exception that addressed the societal level of
analysis). No studies have considered the group level of
analysis, and only a very few (represented in the sample by
Walczuch and Steeghs 2001) have addressed the organizational unit of analysis in the context of these perspectives.
It will also be recalled, from our earlier discussion of Figure
3, that research attention to the overall antecedents → privacy
concerns → outcomes (APCO) rubric has been distributed
non-uniformly. In particular, almost all of the research focus
has been associated with the linkage between privacy concerns and outcomes (usually at the organizational or societal
level), with very little attention having been paid to the
linkage between antecedents and privacy concerns (and what
has been attempted has usually been at the individual level of
analysis). Further, the theoretical and empirical connections
between the APCO rubric and the privacy calculus stream are
rather muted, which causes frustration for observers who wish
to view all of the research findings in a cohesive framework.
It is rather clear, then, that the empirical research stream has
to date been somewhat constrained. And, as will be seen as
we consider Question #3, the picture is even further muddied
when we consider the matter of context.
Question #3: To What Extent Does Context Matter in the Relationships Between
Privacy and Other Constructs?
Contextual Nature of Privacy
In our examination of Question #2, we proceeded as though
privacy was a construct that could be considered as a singular
entity. Yet, many legal and social scholars (Altman 1975,
1977; Hirshleifer 1980; Laufer and Wolfe 1977; Lederer et al.
2004; Malhotra et al. 2004; Margulis 1977a, 2003a; Solove
2004, 2006, 2008; Waldo et al. 2007; Westin 1967, 2001,
2003) believe that general privacy—its conceptual understanding, rigorous definition, and the intensity of the
individual and cultural beliefs it informs—is so dependent on
the specific context that it is impossible to develop a one-sizefits-all conceptualization of general privacy. Scholars have
tended to conclude that it can mean different things to
different individuals and, hence, the roots and consequences
of its violation are also contextual (Bennett 1992).
Acquisti (2004) holds that privacy should be regarded more
as a class of multifaceted interests than as a single,
unambiguous concept, and its value may be discussed only
once its context has also been specified. Recently, Bansal et
al. (2008) provided a review of the general definitions and
meaning of context. Context has been defined as “stimuli and
phenomena that surround and, thus, exist in the environment
external to the individual, most often at a different level of
analysis” (Mowday and Sutton 1993, p. 198). Context could
be related to the type or domain of the research construct
(discipline), time (when), location (where), occupation (who),
culture (with whom), and rationale (why) (Bansal et al. 2008).
For example, the industry to which an organization belongs
can be viewed as a contextual variable; longitudinal studies
can be viewed as contextual studies with time as the
contextual variable (when); comparisons of behavior and
beliefs across occupation, gender, race, income, culture, etc.
can be viewed as contextual studies, demographic being the
contextual variables (who); comparing privacy concerns
across different types of information or web sites with
different levels of information exchange renders the “type of
information” or “information sensitivity” (Malhotra et al.
2004) as contextual variables (see also Bansal et al. 2008).
Individuals are continually engaging in an adjustment process
in which desires for privacy are weighed against desires for
disclosure and personal communication with others (Kimmel
1996; Sheehan 2002). The adjustment occurs in the context
of various situational forces, such as pressures from others,
societal and political norms, and processes of surveillance
used to enforce them (Kimmel 1996). Privacy either depends
on or derives from the nature of its threats (Regan 1995;
Sheehan 2002). Indeed, context has been found to moderate
(Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al.
1995; Smith et al. 1996) or to directly influence (Malhotra et
al. 2004) the nature and extent of privacy-related relationships
(see also Pedersen 1997, 1999).
Figure 5. Studies Based on Type of Information (Refer to Table B6 for more details)
In our analysis of the privacy research, we found that the most
often cited contexts for privacy and privacy-related beliefs are
(1) the type of information collected from individuals (e.g.,
behavioral, financial, medical, biometric, consumer, biographical); (2) the use of information by sector (e.g., healthcare, marketing, and finance); (3) political context (e.g., law
enforcement, constitutional rights of self, government, public
data and media); and (4) technological applications.
The first context, the types of information collected, is sometimes referred to as contextual sensitivity or information
sensitivity (Malhotra et al. 2004): consumers’ beliefs and
behavioral responses to privacy threats depend on the type of
information requested (Dinev and Hart 2007; Milne and
Gordon 1993; Phelps et al. 2000; Sheehan and Hoy 2000). In
general, information about lifestyle characteristics and shopping habits and preferences is considered less sensitive than
medical and financial information (Nowak and Phelps 1992;
Phelps et al. 2000; Sheehan and Hoy 2000). Malhotra et al.
(2004) found a statistically significant direct effect of information sensitivity on trusting beliefs, risk beliefs, and
behavioral intention. Xu et al. (2008) confirmed that privacy-related relationships do vary across types of web sites (e-commerce, social networking, financial, and healthcare),
reflecting the information sensitivity context. Figure 5 and
Table B6 summarize the studies based on the type of
information collected, sorted by different levels of analysis.
(Studies that do not consider a specific type of information
often simply refer to phrases such as “information that most
individuals would categorize as private.” These studies are
noted in the “general” category in Figure 5 and Table B6.)
The second context, industry sector, is considered in Table
B7. For example, there are many IS studies that explore
privacy in e-commerce, marketing, and healthcare. The third
context, the political context of general privacy, is especially
evident in the U.S. and European legal framework and constitutions. Many legal and social scholars view general privacy
as a value that must be balanced against other values that
individuals see as important, including the rights of self, freedom of the press, law and order, and national security (Etzioni
1999; Regan 1995). Most of the literature that examines
general privacy issues in these political contexts is normative
and sometimes emotionally charged, usually reflecting the
authors’ strong beliefs in their causes (see Table B8).
The fourth context, technological applications, complicates privacy research due to the potential duality of its relationships. On the one hand, it is often tacitly assumed that new technological applications will lead to new concerns and outcomes; on the other, privacy protections could be implemented through privacy-enhancing technological applications. Although some of these studies have overlapped with some of the studies discussed in Question #2, the majority of these investigations have proceeded on largely parallel lines that assume a new stream of research is warranted for each new technological domain. As shown in Table 2, various studies have been conducted to examine how technological applications influence privacy concerns for individuals, groups, organizations, and societies.

MIS Quarterly Vol. 35 No. 4/December 2011
Smith et al./Information Privacy Research

Table 2. Influences of Technological Attributes on Privacy Concerns

Direct Marketing
  Individual level: Blattberg and Deighton 1991; Campbell 1997; Culnan 1993, 1995; Milne 1997; Milne et al. 1999; Nowak and Phelps 1992, 1997; Sheehan and Hoy 1999; Smith et al. 1996

Internet and E-Commerce
  Individual level: Acquisti and Varian 2005; Dhillon and Moores 2001; Dinev and Hart 2004, 2006; Dolnicar and Jordaan 2007; Hoffman et al. 1999; Malhotra et al. 2004; Odlyzko 2004; Taylor 2004a, 2004b; Van Slyke et al. 2006
  Organizational level: Bowie and Jamal 2006; Culnan 2000; Henderson 1999; McRobb and Rogerson 2004; Milne and Culnan 2002; Schwaig et al. 2006; Shah et al. 2007
  Societal level: Laudon 1996

Data Mining and Profiling (Personalization)
  Individual level: Awad and Krishnan 2006; Chellappa and Sin 2005; Cranor 2003; Kobsa 2002, 2007

Monitoring and Surveillance
  Individual level: Allen et al. 2007; Fairweather 1999; Tabak and Smith 2005; Zweig and Webster 2002, 2003
  Organizational level: Ariss 2002; D'Urso 2006; Stone-Romero et al. 2003; Tabak and Smith 2005
  Societal level: Dinev et al. 2006a; Kim 2004

Communication (Email, IM, and SMS)
  Individual level: Häkkilä and Chatfield 2005; Meeks 1997, 1999
  Group level: Grinter and Palen 2002; Ito and Daisuke 2003; Ling 2004; Shapiro 1998
  Organizational level: Sipior and Ward 1995; Sipior et al. 1998; Weisband and Reinig 1995

Ubiquitous Computing (Mobile and Location-Enhanced Technologies and RFID)
  Individual level: Bruner and Kumar 2007; Junglas and Spitzmüller 2005; Lederer et al. 2003; McGinity 2000; Sheng et al. 2008; Unni and Harmon 2007; Xu and Gupta 2009; Xu and Teo 2004; Xu et al. 2010
  Societal level: Clarke 2001

Web 2.0 (Online Social Networks)
  Individual level: Acquisti and Gross 2006; Boyd 2008; Boyd and Ellison 2007; Dinev et al. 2009; Dwyer 2007; Dwyer et al. 2007; Gross and Acquisti 2005; Hoadley et al. 2010; Jagatic et al. 2007; Xu et al. 2008
  Societal level: Hodge 2006
Looking Across Contexts

Although researchers have explored privacy-related research questions in different contexts, it remains unclear that the context-specific nuances actually merit parallel research streams. In our view, the answer to Question #3 (“To what extent does context matter in the relationships between privacy and other constructs?”) is “much less than a first-order consideration of the research stream would lead one to believe.”

One can understand why individual researchers (especially in the IS domain) have embraced this path. Most of the new contexts are technologically driven, and thus the IT artifact and the impact of the new technological application on privacy are evident, making the research relevant for publication in IS journals. Moreover, the historical precedent of
the Internet as a technology that completely changed the
landscape of the privacy conversation has given scholars
reason to suspect that another type of technology may once again transform our understanding of privacy. Indeed, a
similar approach is taken by legal scholars who are regularly
engaged in research on how a specific new legal framework
or law passed by the U.S. Congress (or another nation’s
legislative body) would impact privacy.
Yet we note that while these types of studies provide high
relevance and illustrate the IT artifact’s salience, the cumulative impact of such context-driven studies may produce an
overall contribution to knowledge that is suboptimal unless
researchers are constantly aware of an overarching model.
Although it is certainly necessary to ascertain the extent to
which data from different contexts converge and diverge
before such data are pooled, far more could eventually be
learned from a macro perspective. As we will discuss in the
next section, “Future Research and Conclusion,” some minor
parameterization of an APCO framework (as in our Figure 3)
should accommodate the majority of the studies that are now
context-specific. This would lead to a more robust framework
that would move the frontier of knowledge forward far more
quickly.
Future Research and Conclusion
We draw the following overarching conclusions from the
literature review:
•	A large body of normative studies of privacy has accumulated. Many of these offerings are politically
engaging or emotionally charged with strong beliefs
about the ethics and norms that underlie democratic
societies. While these studies have contributed to privacy research in their own right, they also serve as
excellent inspiration and motivation for research
advancement, bringing to light hot and disputed issues,
conflicts, and contradictions. There are many rich and
valid theoretical developments and models in the body of
purely descriptive studies on general privacy in
philosophy, sociology, and law that have not been
addressed in empirical research on privacy. However,
rigorous empirically descriptive studies that either trace
processes associated with, or test implied assertions from,
these value-laden arguments could add great value.
•	Most of the empirically descriptive studies that have addressed linkages between antecedents and privacy concerns have focused on individual perceptions. For the most part, the studies that have associated privacy concerns with outcomes have been concerned with organizational and societal dynamics. Largely missing from the entire research stream are studies associated with group-level privacy. Future empirical studies—both positivist
and interpretive—could profitably be targeted to these
under-researched levels of analysis.
•	Positivist empirical studies will add the greatest value if
they focus on antecedents to privacy concerns and on
actual outcomes. As could be seen in our discussion of
Question #2, only limited attention has been paid to
factors that serve as antecedents to privacy concerns. At
the same time, often due to a reliance on the TRA,
researchers frequently assume that stated intentions will
equate to actual behaviors, an especially tenuous
assumption in light of the privacy paradox.
•	Once the extent to which data from different contexts converge and diverge has been ascertained, attempts to forge a context-neutral, overarching empirical framework should be encouraged.

We now discuss each of these conclusions.
Normative to Empirical
Previous research has provided a plethora of normative arguments that address conundrums ranging from the theoretical
orientation of privacy (e.g., debates about the extent to which
privacy is a human right and ought to be protected as such) to
the regulatory approach that best serves society (e.g., debates
about whether there should be a federal bureau of privacy).
In between, one can find normative arguments associated with
specific matters of privacy interpretation (e.g., debates about
whether an IP address associated with one’s home computer
constitutes “individually identifiable personal data” that
should be protected).
There are “rules” for what constitutes a rigorous normative
argument—often grounded in principles of moral argumentation. Such arguments may include references to empirical
data, but such references are not required since normative
arguments can be made logically without reference to data.
The overall stream of privacy research has benefited from
some of the normative argumentation that has been published.
Further, some of the normative argumentation has yielded
real-world significance in domains such as privacy regulation;
for example, normative arguments undergirded the competing
drivers for institutional structures that can be observed in the
United States and the European Union. But, in our view, a
general extension of the stream of normative privacy argumentation is unlikely to produce many new insights in the IS
domain that will augur for significant changes in policy or
behavior at any of the units of analysis discussed previously:
individual, group, organization, or society.
However, to the extent that the normative conclusions are
viewed as enlightening and motivating events for empirical
studies that trace processes associated with the implementation of these normative conclusions or that test the impact
on different outcomes, their value rises enormously. For
example, an emotionally charged assertion such as “there is a war on privacy” (e.g., Sobel 1976) or “the system is broken”
(Turow 2003), or a lengthy normative debate regarding
whether privacy should be viewed as a right or as a commodity will yield little additional insight. The normative
debates themselves are unlikely to inform us further in terms
of privacy protection, but an examination of the positivist
nomological models that link the normative conclusions to
different outcomes, or the processes through which that
linkage occurs, could prove quite instructive.
A natural difficulty that arises regarding tests of positivist
models from normative studies is the fact that only a subset of
the normative arguments lends itself to such study, since the
normative conclusions are seldom stated in a format that is
suitable for positivist testing. Unlike many theories in the
social sciences that posit relationships such as “higher levels
of A will be associated with higher levels of B,” normative,
value-laden arguments seldom concern themselves with
whether or not they are testable. Consequently, one might
observe a normative conclusion such as “societies ought to
acknowledge a right to privacy in purchase transactions”
without any clear linkage to outcomes that would, or would
not, follow from such a right. In other words, the normative
theory prescribes that higher levels of “A” should be
provided. It then becomes a task for the positivist researcher
to define which constructs might constitute “B” and to
develop a testable linkage back to “A.”
As they traverse this path, researchers should be quite explicit
in identifying the assumptions on which they are relying, and
they should be sure to defend those assumptions. The normative and purely descriptive studies that provide solid argumentation and reasoning can serve as bridges in constructing
empirically testable models and motivating process tracing
studies. As an example, many researchers rely on the set of
fair information practices (FIPs) that were codified in a 1973
study by the U.S. Department of Health, Education, and
Welfare, and some researchers build their positivist models around these FIPs. But the FIPs do not have the standing of
U.S. law in any omnibus context, and careful listeners will
conclude that some members of U.S. industry groups do not
even embrace certain principles of the FIPs. It is, therefore,
incumbent upon researchers who assume the validity of the
FIPs to note this clearly in their papers and to defend why the
assumption should hold for the purposes of the positivist
research.
Levels of Analysis
The majority of empirical studies to date have viewed the
individual as the salient unit of analysis. This is to some
degree understandable, since such studies lend themselves to
data collection through written and online surveys, and they
are therefore easier to implement. Further, great insight can
be gained from consideration of individuals’ perceptions of
various privacy-related activities.
However, much of this focus on individual-level privacy perceptions and relationships has been at the expense of our
understanding at other levels of analysis. The challenge, of
course, is that studies at the organizational and societal levels
are necessarily more complex and less conducive to “quick”
data collection techniques such as written and online surveys.
Indeed, most rigorous studies of organizational privacy
policies and practices would likely include a set of exhaustive
interviews with an organization’s members and stakeholders,
and some amount of deep process tracing would also likely be
involved. Such studies are the best approach to uncovering
the somewhat subtle organizational dynamics that drive
privacy policies and practices.
Cross-national and cross-cultural studies are also likely to be
challenging, although one can often simplify the measurement
of certain independent variables in positivist studies by considering previously established metrics. For example, many
countries’ privacy regulatory structures have already been
documented by previous researchers. Similarly, national
public opinion surveys regarding privacy concerns are
available for a number of industrialized countries. While
researchers may still have to struggle with some challenges
when using such external data (e.g., manual adjustment of
degrees of freedom for statistical inference), the structure of
the studies may not be overly complex. On the other hand,
interpretive studies at the cross-national and cross-cultural
levels may be overwhelming in their complexity. For
example, many languages, including those in European
countries (e.g., Russian, French, Italian), do not have a word
for privacy and have adopted the English word.9 One’s ability to collect interpretive (and, particularly, process-oriented) data across national and societal boundaries will ordinarily be limited, perhaps even artificially, due to historical factors. Even so, as the majority of the literature to date has focused on the United States, the relatively lesser focus and complexity of the literature in other countries may be due to the varying legal environs. Presumably, the United States has a more complex legal environment than Europe, and this may be partly the reason for the emphasis of the research on the United States. In any case, a richer focus on international dimensions of privacy research is needed.

9 The etymological origin of privacy is distant from its modern use: the Latin word privus has an archaic meaning of single and shares a common root with the word privare, meaning deprive.
Of particular note is that there have been very few studies that
considered privacy at the small group level. Of course, it is
at the small group level that many supra-organizational
policies and practices come to rest, so the paucity of studies
at this level strikes us as a significant weakness in the privacy
literature stream. Further, within small groups (whether
nestled in larger organizations or not), norms regarding
protection of individuals’ privacy may vary, and the process
through which these norms develop could serve as an
interesting area for future research. It is also likely that small
groups differ in norms regarding protection of the group’s
own information,10 and these differences (and the processes
through which they evolve) also merit future research
consideration. In a related vein, a single individual likely
belongs to more than one group, so (s)he may adhere to
different norms regarding privacy as (s)he travels between
groups. How an individual navigates such different normative
expectations would also be a fruitful domain for additional
research.
In fact, social networking websites such as Facebook and
Twitter offer dynamic examples in which information boundaries are created in groups of various sizes and relationship
levels. One can easily imagine a set of studies that considers
the factors associated with different boundaries regarding
disclosure within and across online groups. Questions such
as “do limited-access groups differ from open access groups
in their information norms and processes?” and “to what
extent do an online group’s administrator’s own information
concerns dictate the privacy norms and processes in the
group?” could be of great interest in this domain.
Additionally, as a precursor to this work, the very definition
of a group merits consideration in light of developments in
online social networking. To the extent that sociological and
communications research on small groups has been undertaken in the past, it has usually been associated with physical
groups in which membership is somewhat static (or, at least, easily clarified) and non-anonymous. However, online groups are often associated with fluid entry and exit, with the option of identification through screen names or anonymous avatars. In addition, what constitutes a “small” group in terms of sociology and communication (for which traditional theories were developed) may well be subject to dynamic forces. Although a physical group’s characteristics may shift substantially as it grows in size, this may not be as true of an online group. Clear distinctions (and, likely, reconsiderations of nomenclature) are warranted.

10 We refrain from referring to the concept of privacy when referring to information associated with the group, since privacy is best viewed as referring to information about individuals. Even so, norms regarding the proprietary nature of group information may vary across groups.
Yet it is also obvious that, as with organizational studies, it
will not be easy to conduct these inquiries into the privacy
relationships and practices of small groups. Although some
possibilities for surveys do exist, the most helpful studies will
likely be grounded in direct observation of the group members
and their interactions. This suggests a long-term research
agenda that will rely heavily on access to group settings in a
variety of domains. It will be almost impossible to examine
these processes without direct observation or participation
(i.e., through action research).
Antecedents and Outcomes
As suggested above, the most helpful positivist studies will be
those that examine differences in outcomes as a function of
privacy-related independent variables. Outcomes should be
interpreted as actual changes of state or behavior; this is distinct from an examination of attitudes, beliefs, and intentions.
Indeed, it is rare for privacy research to consider actual outcomes in tests of models. Although some exceptions are
emerging (see Cranor et al. 2007; Hui et al. 2007), the general
approach has been to ascertain what subjects report that they
would intend to do in a certain situation—for example,
subjects might report their likelihood of writing to an elected
official to complain about a perceived privacy violation
(Smith et al. 1996).
Particularly because of the privacy paradox, it will behoove
privacy researchers to consider their nomological models as
exhaustive only when they map to actual outcomes at one or
more levels of analysis. Among others, outcomes such as
documented violations of privacy, successful prosecutions of
privacy violations, and cross-national data flow stoppages
would all serve as salient outcomes in testable models. At the
organizational level, an interesting contribution would be to
reveal specific organizational outcomes and consequential
decisions made after breaches of personal information considered private. In contrast, much less insight would be
gained through tests of models that culminate with dependent
variables that are perceptual in nature—for example, a survey
of subjects’ differing levels of stated privacy concern as
associated with different uses of personal data. To the extent
that a privacy paradox exists, it is obvious that such
perceptual measures are quite distinct from measures that will
provide insight to decision-makers.
Overarching APCO Macro Model
Finally, we argue that positivist privacy researchers should
keep their eye on an optimized APCO macro model that
eventually includes an expanded set of antecedents as well as
an exhaustive set of outcomes. The ultimate objective should
be a macro model that will prove useful across disciplines and
contexts.
It is well beyond the scope of the present endeavor to propose
this full APCO macro model. As a nod to the attributes of
emerging technological applications and other contextual
factors, the macro model may require some amount of
parameterization if contextual differences are demonstrated to
be clearly salient. For example, one can envision a macro
model that includes a set of antecedents that would apply to
all types of personal information, such as financial and
medical. Through parameterization, some of the antecedents
could be hypothesized to hold with greater weight for some of
the information types than for others based on findings from
studies that target these contextual differences. Yet, all
researchers should be aware of the exhaustive set of antecedents, as there is little need for each discipline or sector to
investigate its own set of antecedents. Similarly, context
parameterization could be used to highlight outcomes that
would be more or less salient for different contexts.
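The idea of context parameterization can be made concrete with a simple moderated specification. The following is an illustrative sketch only, not a model proposed in this paper; the symbols PC, A, C, and the coefficient labels are hypothetical:

```latex
% Hypothetical moderated model: PC_i is respondent i's privacy
% concern, A_{ik} is the k-th antecedent, C_i is a context
% indicator (e.g., information type), and \varepsilon_i is error.
PC_i = \beta_0 + \sum_{k=1}^{K} \beta_k A_{ik}
     + \sum_{k=1}^{K} \gamma_k \left( A_{ik} \times C_i \right)
     + \varepsilon_i
```

Under such a specification, the β coefficients capture antecedent weights common across contexts, while a significant γ coefficient flags an antecedent whose weight is context-dependent, which is the kind of finding that would justify parameterizing the macro model.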
The development of this macro model would be arduous and
contentious, but the ultimate payoff for the gestalt of privacy
research would be immense. With the macro model in place,
individual researchers could then contribute studies that
addressed a clear subset of the antecedents and outcomes,
with possible parameterization, always confident that their
findings would contribute a clearly annotated brick in the wall
of privacy knowledge.
Conclusion
The stream of modern privacy research had its genesis in the
1970s. In the subsequent four decades, a number of useful
studies have been conducted and published, but the overall
research stream has been suboptimized because of its
disjointed nature.
We believe that our recommendations for future research in
privacy should lead to a more cohesive stream of literature
that yields actionable steps for individuals, managers, and
regulators.
Acknowledgments
The authors are very grateful to Lynne Markus, Senior Editor, for
her encouragement and direction in helping them develop this
manuscript. They are also grateful to the associate editor, Paul
Pavlou, for his helpful guidance, and to the three anonymous
reviewers for their constructive advice and very helpful comments
on earlier versions of this manuscript. Heng Xu gratefully acknowledges the financial support of the National Science Foundation
under grant CNS-0953749.
References
Ackerman, M. 2004. “Privacy in Pervasive Environments: Next
Generation Labeling Protocols,” Personal and Ubiquitous Computing (8:6), pp. 430-439.
Acquisti, A. 2004. “Privacy in Electronic Commerce and the
Economics of Immediate Gratification,” in Proceedings of the 5th
ACM Electronic Commerce Conference, New York: ACM Press,
pp. 21-29.
Acquisti, A., and Gross, R. 2006. “Imagined Communities:
Awareness, Information Sharing, and Privacy on the Facebook,”
in Proceedings of 6th Privacy Enhancing Technologies Symposium, Cambridge, UK, June 28-30, pp. 36-58.
Acquisti, A., and Grossklags, J. 2005a. “Privacy and Rationality in
Individual Decision Making,” IEEE Security & Privacy (3:1), pp.
26-33.
Acquisti, A., and Grossklags, J. 2005b. “Uncertainty, Ambiguity
and Privacy,” in Proceedings of the 4th Annual Workshop on Economics and Information Security, Cambridge, MA, June 2-3, pp.
2-3.
Acquisti, A., and Varian, H. R. 2005. “Conditioning Prices on Purchase History,” Marketing Science (24:3), pp. 367-381.
Alderman, E., and Kennedy, C. 1997. The Right to Privacy, New
York: Vintage Books.
Allen, M. W., Coopman, S. J., Hart, J. L., and Walker, K. L. 2007.
“Workplace Surveillance and Managing Privacy Boundaries,”
Management Communication Quarterly (21:2), pp. 172-200.
Altman, I. 1975. The Environment and Social Behavior: Privacy,
Personal Space, Territory, and Crowding, Monterey, CA:
Brooks/Cole Publishing.
Altman, I. 1977. “Privacy Regulation: Culturally Universal or Culturally Specific?,” Journal of Social Issues (33:3), pp. 66-84.
Andrade, E. B., Kaltcheva, V., and Weitz, B. 2002. “Self-Disclosure on the Web: The Impact of Privacy Policy, Reward,
and Brand Reputation,” in Advances in Consumer Research, S.
M. Broniarczyk and K. Nakamoto (eds.), Valdosta, GA:
Association for Consumer Research, pp. 350-353.
Ariss, S. S. 2002. “Computer Monitoring: Benefits and Pitfalls
Facing Management,” Information & Management (39:7), pp.
553-558.
Ashworth, L., and Free, C. 2006. “Marketing Dataveillance and
Digital Privacy: Using Theories of Justice to Understand Consumers’ Online Privacy Concerns,” Journal of Business Ethics
(67:2), pp. 107-123.
Awad, N. F., and Krishnan, M. S. 2006. “The Personalization
Privacy Paradox: An Empirical Evaluation of Information Transparency and the Willingness to be Profiled Online for Personalization,” MIS Quarterly (30:1), pp. 13-28.
Bansal, G., Zahedi, F., and Gefen, D. 2008. “The Moderating
Influence of Privacy Concern on the Efficacy of Privacy
Assurance Mechanisms for Building Trust: A Multiple-Context
Investigation,” in Proceedings of 29th International Conference
on Information Systems. Paris, France, December 14-17.
Bansal, G., Zahedi, F. M., and Gefen, D. 2010. “The Impact of
Personal Dispositions on Information Sensitivity, Privacy
Concern and Trust in Disclosing Health Information Online,”
Decision Support Systems (49:2), pp. 138-150.
Belanger, F., Hiller, J. S., and Smith, W. J. 2002. “Trustworthiness
in Electronic Commerce: The Role of Privacy, Security, and Site
Attributes,” Journal of Strategic Information Systems (11:3-4),
pp. 245-270.
Bellman, S., Johnson, E. J., Kobrin, S. J., and Lohse, G. L. 2004.
“International Differences in Information Privacy Concerns: A
Global Survey of Consumers,” Information Society (20:5), pp.
313-324.
Bennett, C. J. 1992. Regulating Privacy: Data Protection and
Public Policy in Europe and the United States, Ithaca, NY:
Cornell University Press.
Bennett, C. J. 1995. The Political Economy of Privacy: A Review
of the Literature, Hackensack, NJ: Center for Social and Legal
Research.
Bennett, C. J., and Raab, C. D. 1997. “The Adequacy of Privacy:
The European Union Data Protection Directive and the North
American Response,” Information Society (13:3), pp. 245-263.
Benson, D. 1983. “A Field Study of End User Computing:
Findings and Issues,” MIS Quarterly (7:4), pp. 35-45.
Blattberg, R. C., and Deighton, J. 1991. “Interactive Marketing:
Exploiting the Age of Addressability,” Sloan Management
Review (33:1), pp. 5-14.
Bok, S. 1989. Secrets: On the Ethics of Concealment and
Revelation, New York: Vintage.
Bowie, N. E., and Jamal, K. 2006. “Privacy Rights on the Internet:
Self-Regulation or Government Regulation?,” Business Ethics
Quarterly (16:3), pp. 323-342.
Boyd, D. 2008. “Facebook’s Privacy Trainwreck: Exposure,
Invasion and Social Convergence,” Convergence: The International Journal of Research into New Media Technologies
(14:1), pp. 13-20.
Boyd, D., and Ellison, N. B. 2007. “Social Network Sites: Definition, History, and Scholarship,” Journal of Computer-Mediated
Communication (13:1), Article 11.
Breckenridge, A. C. 1970. The Right to Privacy, Lincoln, NE:
University of Nebraska Press.
Brenton, M. 1964. The Privacy Invaders, New York: Coward
McCann.
Bruner II, G. C., and Kumar, A. 2007. “Attitude toward Location-Based Advertising,” Journal of Interactive Advertising (7:2)
(http://www.jiad.org/article89).
Budnitz, M. E. 1998. “Privacy Protection for Consumer Transactions in Electronic Commerce: Why Self-Regulation Is Inadequate,” South Carolina Law Review (49).
Bynum, T. 2008. “Computer and Information Ethics,” in The
Stanford Encyclopedia of Philosophy (Winter 2008 Edition), E.
N. Zalta (ed.), Stanford, CA: Center for the Study of Language
and Information (http://plato.stanford.edu/).
Camp, L. J. 1999. “Web Security and Privacy: An American Perspective,” Information Society (15:4), pp. 249-256.
Campbell, A. J. 1997. “Relationship Marketing in Consumer Markets,” Journal of Direct Marketing (11:3), pp. 44-57.
Campbell, J. E., and Carlson, M. 2002. “Panopticon.com: Online
Surveillance and the Commodification of Privacy,” Journal of
Broadcasting & Electronic Media (46:4), pp. 586-606.
Caudill, E. M., and Murphy, P. E. 2000. “Consumer Online
Privacy: Legal and Ethical Issues,” Journal of Public Policy &
Marketing (19:1), pp. 7-19.
Cespedes, F. V., and Smith, H. J. 1993. “Database Marketing:
New Rules for Policy and Practice,” Sloan Management Review
(34:4), pp. 7-22.
Chellappa, R. K. 2008. “Consumers’ Trust in Electronic Commerce
Transactions: The Role of Perceived Privacy and Perceived
Security,” unpublished paper, Emory University, Atlanta, GA.
Chellappa, R. K., and Sin, R. 2005. “Personalization Versus
Privacy: An Empirical Examination of the Online Consumer’s
Dilemma,” Information Technology and Management (6:2), pp.
181-202.
Chen, K., and Rea, A. I. 2004. “Protecting Personal Information
Online: A Survey of User Privacy Concerns and Control Techniques,” Journal of Computer Information Systems (44:4), pp.
85-92.
Clark, T. D., Jones, M. C., and Armstrong, C. P. 2007. “The Dynamic Structure of Management Support Systems: Theory
Development, Research Focus, and Direction,” MIS Quarterly
(31:3), pp. 579-615.
Clarke, R. 2001. “Person Location and Person Tracking: Technologies, Risks and Policy Implications,” Information Technology & People (14:2), pp. 206-231.
Cohen, J. E. 2001. “Privacy, Ideology, and Technology: A
Response to Jeffrey Rosen,” Georgetown Law Journal (89), p.
2029.
Consumers-Union. 2008. “Consumer Reports Poll: Americans
Extremely Concerned About Internet Privacy,” September 25
(http://www.consumersunion.org/pub/core_telecom_and_utiliti
es/006189.html).
Copp, D. (ed.). 2007. The Oxford Handbook of Ethical Theory,
New York: Oxford University Press.
Cranor, L. F. 2003. “‘I Didn’t Buy it for Myself’: Privacy and
Ecommerce Personalization,” in Proceedings of the 2003 ACM
Workshop on Privacy in the Electronic Society, Washington, DC,
October 27-30, pp. 111-117.
Cranor, L. F., Egelman, S., Tsai, J., and Acquisti, A. 2007. “The
Effect of Online Privacy Information on Purchasing Behavior:
An Experimental Study,” in Proceedings of 28th International
Conference on Information Systems, Montréal, Canada,
December 9-12.
Culnan, M. J. 1985. “Consumer Awareness of Name Removal
Procedures: Implication for Direct Marketing,” Journal of Interactive Marketing (9), pp. 10-19.
Culnan, M. J. 1993. “‘How Did They Get My Name’? An
Exploratory Investigation of Consumer Attitudes toward Secondary Information Use,” MIS Quarterly (17:3), pp. 341-364.
Culnan, M. J. 2000. “Protecting Privacy Online: Is Self-Regulation
Working?,” Journal of Public Policy and Marketing (19:1), pp.
20-26.
Culnan, M. J., and Armstrong, P. K. 1999. “Information Privacy
Concerns, Procedural Fairness and Impersonal Trust: An Empirical Investigation,” Organization Science (10:1), pp. 104-115.
Culnan, M. J., and Bies, R. J. 2003. “Consumer Privacy:
Balancing Economic and Justice Considerations,” Journal of
Social Issues (59:2), pp. 323-342.
Culnan, M. J., and Williams, C. C. 2009. “How Ethics Can
Enhance Organizational Privacy: Lessons from the ChoicePoint
and TJX Data Breaches,” MIS Quarterly (33:4), pp. 673-687.
D’Urso, S. 2006. “Who’s Watching Us at Work? Toward a Structural–Perceptual Model of Electronic Monitoring and
Surveillance in Organizations,” Communication Theory (16:3),
pp. 281-303.
Davies, S. G. 1997. “Re-Engineering the Right to Privacy: How
Privacy Has Been Transformed from a Right to a Commodity,”
in Technology and Privacy: The New Landscape, P. E. Agre and
M. Rotenberg (eds.), Cambridge, MA: MIT Press, pp. 143-165.
Dhillon, G. S., and Moores, T. 2001. “Internet Privacy: Interpreting Key Issues,” Information Resources Management Journal
(14:4), pp. 33-37.
Dholakia, N., and Zwick, D. 2001. “Contrasting European and
American Approaches to Privacy in Electronic Markets: A
Philosophical Perspective,” Electronic Markets (11:2), pp.
116-120.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., and Colautti,
C. 2006a. “Internet Users’ Privacy Concerns and Beliefs About
Government Surveillance: An Exploratory Study of Differences
between Italy and the United States,” Journal of Global Information Management (14:4), pp. 57-93.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., and
Colautti, C. 2006b. “Privacy Calculus Model in E-Commerce:
A Study of Italy and the United States,” European Journal of
Information Systems (15:4), pp. 389-402.
Dinev, T., and Hart, P. 2004. “Internet Privacy Concerns and Their
Antecedents: Measurement Validity and a Regression Model,”
Behavior and Information Technology (23:6), pp. 413-423.
Dinev, T., and Hart, P. 2006. “An Extended Privacy Calculus
Model for E-Commerce Transactions,” Information Systems
Research (17:1), pp. 61-80.
Dinev, T., and Hart, P. 2007. “Privacy Concerns and Levels of
Information Exchange: An Empirical Investigation of Intended
E-Services Use,” E-Service Journal (4:3), pp. 25-61.
Dinev, T., Xu, H., and Smith, H. J. 2009. “Information Privacy
Values, Beliefs and Attitudes: An Empirical Analysis of Web 2.0
Privacy,” in Proceedings of 42nd Hawaii International Conference on System Sciences, Los Alamitos, CA: IEEE Computer
Society Press.
Dolnicar, S., and Jordaan, Y. 2007. “A Market-Oriented Approach
to Responsibly Managing Information Privacy Concerns in
Direct Marketing,” Journal of Advertising (36:2), pp. 123-149.
Dwyer, C. 2007. “Digital Relationships in the ‘MySpace’ Generation: Results from a Qualitative Study,” in Proceedings of 40th
Hawaii International Conference on System Sciences, Los
Alamitos, CA: IEEE Computer Society Press.
Dwyer, C., Hiltz, S., and Passerini, K. 2007. “Trust and Privacy
Concern Within Social Networking Sites: A Comparison of
Facebook and MySpace,” in Proceedings of the 13th Americas
Conference on Information Systems (AMCIS), Keystone, CO,
August 9-12.
Eastlick, M. A., Lotz, S. L., and Warrington, P. 2006. “Understanding Online B-to-C Relationships: An Integrated Model of
Privacy Concerns, Trust, and Commitment,” Journal of Business
Research (59:8), pp. 877-886.
Edelman, B. 2011. “Adverse Selection in Online ‘Trust’ Certifications and Search Results,” Electronic Commerce Research and
Applications (10:1), pp. 17-25.
Etzioni, A. 1999. The Limits of Privacy, New York: Basic Books.
Fairweather, N. B. 1999. “Surveillance in Employment: The Case
of Teleworking,” Journal of Business Ethics (22:1), pp. 39-49.
Featherman, M. S., and Pavlou, P. A. 2003. “Predicting E-Services
Adoption: A Perceived Risk Facets Perspective,” International
Journal of Human-Computer Studies (59:4), pp. 451-474.
Fishbein, M., and Ajzen, I. 1975. Belief, Attitude, Intention, and
Behavior: An Introduction to Theory and Research, Reading,
MA: Addison-Wesley.
Foxman, E. R., and Kilcoyne, P. 1993. “Information Technology,
Marketing Practice, and Consumer Privacy: Ethical Issues,”
Journal of Public Policy & Marketing (12:1), pp. 106-119.
Garfinkel, S. 2000. Database Nation: The Death of Privacy in the
21st Century, Sebastopol, CA: O’Reilly Media.
Gomez, J., Pinnick, T., and Soltani, A. 2009. “KnowPrivacy: The
Current State of Web Privacy, Data Collection, and Information
Sharing,” School of Information, University of California
Berkeley (http://www.knowprivacy.org/).
Goodhue, D. L., and Straub, D. W. 1991. “Security Concerns of
System Users: A Study of Perceptions of the Adequacy of Security,” Information and Management (20:1), pp. 13-27.
Grinter, R., and Palen, L. 2002. “Instant Messaging in Teen Life,”
in Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work, New Orleans, Louisiana, pp. 21-30.
Gross, R., and Acquisti, A. 2005. “Information Revelation and
Privacy in Online Social Networks,” in Proceedings of the 2005
ACM Workshop on Privacy in the Electronic Society, Alexandria,
VA.
Häkkilä, J., and Chatfield, C. 2005. “‘It’s like If You Opened
Someone Else’s Letter’: User Perceived Privacy and Social
Practices with SMS Communication,” in Proceedings of the
Mobile HCI Conference, Salzburg, Austria, pp. 219-222.
Hann, I.-H., Hui, K.-L., Lee, S. Y. T., and Png, I. P. L. 2008.
“Overcoming Online Information Privacy Concerns: An
Information-Processing Theory Approach,” Journal of
Management Information Systems (24:2), pp. 13-42.
Henderson, H. 1999. Privacy in the Information Age, New York:
Facts On File.
Herson, L. J. R. 1990. Politics of Ideas: Political Theory and
American Public Policy, Long Grove, IL: Waveland Press.
Hirshleifer, J. 1980. “Privacy: Its Origin, Function, and Future,”
The Journal of Legal Studies (9:4), pp. 649-664.
Hoadley, C. M., Xu, H., Lee, J. J., and Rosson, M. B. 2010.
“Privacy as Information Access and Illusory Control: The Case
of the Facebook News Feed Privacy Outcry,” Electronic
Commerce Research and Applications (9:1), pp. 50-60.
Hodge, M. J. 2006. “Fourth Amendment and Privacy Issues on the
New Internet: Facebook.com and Myspace.com,” Southern
Illinois University Law Journal (31), pp. 95-123.
Hoffman, D. L., Novak, T. P., and Peralta, M. A. 1999. “Information Privacy in the Marketspace: Implications for the Commercial Uses of Anonymity on the Web,” Information Society
(15:2), pp. 129-139.
Hui, K.-L., Tan, B. C. Y., and Goh, C.-Y. 2006. “Online Information Disclosure: Motivators and Measurements,” ACM Transactions on Internet Technology (6), pp. 415-441.
Hui, K. L., Teo, H. H., and Lee, S. Y. T. 2007. “The Value of
Privacy Assurance: An Exploratory Field Experiment,” MIS
Quarterly (31:1), pp. 19-33.
Ito, M., and Daisuke, O. 2003. “Mobile Phones, Japanese Youth
and the Replacement of Social Contact,” in Proceedings of Front
Stage/Back Stage: Mobile Communication and the Renegotiation
of the Social Sphere, R. Ling and P. Pedersen (eds.), Grimstad,
Norway.
Jagatic, T. N., Johnson, N. A., Jakobsson, M., and Menczer, F.
2007. “Social Phishing,” Communications of the ACM (50:10),
pp. 94-100.
Jarvenpaa, S. L., and Leidner, D. E. 1999. “Communication and
Trust in Global Virtual Teams,” Organization Science (10:6), pp.
791-815.
Jarvenpaa, S. L., and Tiller, E. H. 1999. “Integrating Market, Technology, and Policy Opportunities in E-Business Strategy,”
Journal of Strategic Information Systems (8:3), pp. 235-249.
Jarvenpaa, S. L., Tractinsky, N., and Vitale, M. 2000. “Consumer
Trust in an Internet Store,” Information Technology and
Management (1:1-2), pp. 45-71.
Jensen, C., Potts, C., and Jensen, C. 2005. “Privacy Practices of
Internet Users: Self-Reports versus Observed Behavior,” International Journal of Human-Computer Studies (63), pp. 203–227.
Jentzsch, N. 2001. “The Economics and Regulation of Financial
Privacy: A Comparative Analysis of the United States and
Europe,” Working Paper, John F. Kennedy Institute.
Johnson, C. A. 1974. “Privacy as Personal Control,” in Man-Environment Interactions: Evaluations and Applications: Part
2, D. H. Carson (ed.), Washington, DC: Environmental Design
Research Association, pp. 83-100.
Junglas, A. I., and Spitzmüller, C. 2005. “A Research Model for
Studying Privacy Concerns Pertaining to Location-Based Services,” in Proceedings of the 38th Hawaii International Conference on System Sciences, Los Alamitos, CA: IEEE Computer Society Press.
Kelvin, P. 1973. “A Social Psychological Examination of Privacy,”
British Journal of Social and Clinical Psychology (12), pp.
248-261.
Kim, M.-C. 2004. “Surveillance Technology, Privacy and Social
Control: With Reference to the Case of the Electronic National
Identification Card in South Korea,” International Sociology
(19:2), pp. 193-213.
Kimmel, A. J. 1996. Ethical Issues in Behavioral Research,
Boston: Blackwell Publishers.
Klopfer, P. H., and Rubenstein, D. I. 1977. “The Concept Privacy
and Its Biological Basis,” Journal of Social Issues (33:3), pp.
52-65.
Kobsa, A. 2002. “Personalized Hypermedia and International
Privacy,” Communications of the ACM (45:5), pp. 64-67.
Kobsa, A. 2007. “Privacy-Enhanced Personalization,” Communications of the ACM (50:8), pp. 24-33.
Kobsa, A., and Schreck, J. 2003. “Privacy through Pseudonymity
in User-Adaptive Systems,” ACM Transactions on Internet Technology (3), pp. 149–183.
LaRose, R., and Rifon, N. 2006. “Your Privacy Is Assured—of
Being Disturbed: Comparing Web Sites with and Without
Privacy Seals,” New Media and Society (8:6), pp. 1009-1029.
Laudon, K. C. 1996. “Markets and Privacy,” Communications of
the ACM (39:9), pp. 92-104.
Laufer, R. S., and Wolfe, M. 1977. “Privacy as a Concept and a
Social Issue: Multidimensional Developmental Theory,” Journal
of Social Issues (33:3), pp. 22-42.
Lederer, S., Hong, J. I., Dey, A. K., and Landay, J. A. 2004.
“Personal Privacy through Understanding and Action: Five
Pitfalls for Designers,” Personal and Ubiquitous Computing
(8:6), pp. 440-454.
Lederer, S., Mankoff, J., and Dey, K. A. 2003. “Who Wants to
Know What When? Privacy Preference Determinants in
Ubiquitous Computing,” in Proceedings of Conferences on
Human Factors in Computing Systems, Fort Lauderdale, Florida.
Leidner, D., and Kayworth, T. 2006. “Review: A Review of
Culture in Information Systems Research: Toward a Theory of
Information Technology Culture Conflict,” MIS Quarterly (30:2),
pp. 357-399.
Ling, R. 2004. “The Coordination of Everyday Life,” in The
Mobile Connection: The Cell Phone’s Impact on Society, R.
Ling (ed.), Amsterdam: Elsevier, pp. 57-81.
Lu, Y., Tan, B. C. Y., and Hui, K.-L. 2004. “Inducing Customers
to Disclose Personal Information to Internet Businesses with
Social Adjustment Benefits,” in Proceedings of 25th International
Conference on Information Systems, R. Agarwal, L. J. Kirsch,
and J. I. DeGross (eds.), Washington, DC, December 9-12, pp.
272-281.
Malhotra, N. K., Kim, S. S., and Agarwal, J. 2004. “Internet Users’
Information Privacy Concerns (IUIPC): The Construct, the
Scale, and a Causal Model,” Information Systems Research
(15:4), pp. 336-355.
Margulis, S. T. 1977a. “Conceptions of Privacy: Current Status
and Next Steps,” Journal of Social Issues (33:3), pp. 5-21.
Margulis, S. T. 1977b. “Privacy as a Behavioral Phenomenon:
Introduction,” Journal of Social Issues (33:3), pp. 1-4.
Margulis, S. T. 2003a. “On the Status and Contribution of Westin’s
and Altman’s Theories of Privacy,” Journal of Social Issues
(59:2), pp. 411-429.
Margulis, S. T. 2003b. “Privacy as a Social Issue and Behavioral
Concept,” Journal of Social Issues (59:2), pp. 243-261.
Marx, G. T. 1999. “What’s in a Name? Some Reflections on the
Sociology of Anonymity,” Information Society (15:2), pp.
99-112.
Marx, G. T. 2001. “Murky Conceptual Waters: The Public and the
Private,” Ethics and Information Technology (3:3), pp. 157-169.
McGinity, M. 2000. “Surfing Your Turf: For a Medium that Holds
Anonymity in High Regard, Privacy Is Fast Eroding,”
Communications of the ACM (43:4), pp. 19-21.
McLean, D. 1995. Privacy and Its Invasion, Westport, CT:
Praeger.
McRobb, S., and Rogerson, S. 2004. “Are They Really Listening?
An Investigation into Published Online Privacy Policies at the
Beginning of the Third Millennium,” Information Technology &
People (17:4), p. 442.
Meeks, B. N. 1997. “Privacy Lost, Anytime, Anywhere,” Communications of the ACM (40:8), pp. 11-13.
Meeks, B. N. 1999. “The Privacy Hoax,” Communications of the
ACM (42:2), pp. 17-19.
Metzger, M. J. 2004. “Privacy, Trust, and Disclosure: Exploring
Barriers to Electronic Commerce,” Journal of Computer-Mediated Communication (9:4).
Milberg, S. J., Burke, S. J., Smith, H. J., and Kallman, E. A. 1995.
“Values, Personal Information Privacy, and Regulatory
Approaches,” Communications of the ACM (38:12), pp. 65-74.
Milberg, S. J., Smith, H. J., and Burke, S. J. 2000. “Information
Privacy: Corporate Management and National Regulation,”
Organization Science (11:1), pp. 35-57.
Milne, G. R. 1997. “Consumer Participation in Mailing Lists: A
Field Experiment,” Journal of Public Policy and Marketing
(16:2), pp. 298-309.
Milne, G. R., and Boza, M.-E. 1999. “Trust and Concern in Consumers’ Perceptions of Marketing Information Management
Practices,” Journal of Interactive Marketing (13:1), pp. 5-24.
Milne, G. R., and Culnan, M. J. 2002. “Using the Content of
Online Privacy Notices to Inform Public Policy: A Longitudinal
Analysis of the 1998-2001 US Web Surveys,” Information
Society (18:5), pp. 345-359.
Milne, G. R., and Gordon, E. M. 1993. “Direct Mail Privacy-Efficiency Trade-Offs Within an Implied Social Contract Framework,” Journal of Public Policy and Marketing (12:2), pp.
206-215.
Milne, G. R., and Rohm, A. 2000. “Consumer Privacy and Name
Removal Across Direct Marketing Channels: Exploring Opt-in
and Opt-out Alternatives,” Journal of Public Policy and
Marketing (19:2), pp. 238-249.
Milne, G. R., Rohm, A., and Boza, M.-E. 1999. “Trust Has to Be
Earned,” in Frontiers of Direct Marketing, J. Phelps (ed.), New
York: Direct Marketing Educational Foundation, pp. 31-41.
Miyazaki, A., and Krishnamurthy, S. 2002. “Internet Seals of
Approval: Effects on Online Privacy Policies and Consumer
Perceptions,” Journal of Consumer Affairs (36:1), pp. 28-49.
Moon, Y. 2000. “Intimate Exchanges: Using Computers to Elicit
Self-Disclosure from Consumers,” Journal of Consumer
Research (26), pp. 323-339.
Mowday, R. T., and Sutton, R. I. 1993. “Organizational Behavior:
Linking Individuals and Groups to Organizational Contexts,”
Annual Reviews in Psychology (44:1), pp. 195-229.
Nissenbaum, H. 1998. “Protecting Privacy in an Information Age:
The Problem of Privacy in Public,” Law and Philosophy (17:5),
pp. 559-596.
Nissenbaum, H. 1999. “The Meaning of Anonymity in an Information Age,” The Information Society (15:2), pp. 141-144.
Norberg, P. A., and Horne, D. R. 2007. “Privacy Attitudes and
Privacy-Related Behavior,” Psychology and Marketing (24:10),
pp. 829-847.
Norberg, P. A., Horne, D. R., and Horne, D. A. 2007. “The Privacy
Paradox: Personal Information Disclosure Intentions Versus
Behaviors,” Journal of Consumer Affairs (41:1), pp. 100-126.
Nowak, G. J., and Phelps, J. 1992. “Understanding Privacy Concerns: An Assessment of Consumers’ Information-Related
Knowledge and Beliefs,” Journal of Direct Marketing (6:4), pp.
28-39.
Nowak, G. J., and Phelps, J. 1995. “Direct Marketing and the Use
of Individual-Level Consumer Information: Determining How
and When ‘Privacy’ Matters,” Journal of Direct Marketing (9:3),
pp. 46-60.
Nowak, G. J., and Phelps, J. 1997. “Direct Marketing and the Use
of Individual-Level Consumer Information: Determining How
and When ‘Privacy’ Matters,” Journal of Direct Marketing
(11:4), pp. 94-108.
O’Donoghue, T., and Rabin, M. 2001. “Choice and Procrastination,” Quarterly Journal of Economics (116), pp. 121-160.
Odlyzko, A. 2004. “Privacy, Economics, and Price Discrimination
on the Internet,” in Economics of Information Security, L. J.
Camp and S. Lewis (eds.), New York: Springer, pp. 187-211.
Pavlou, P. A. 2003. “Consumer Acceptance of Electronic Commerce: Integrating Trust and Risk with the Technology
Acceptance Model,” International Journal of Electronic
Commerce (7:3), pp. 69-103.
Pavlou, P. A., and Gefen, D. 2004. “Building Effective Online
Marketplaces with Institution-Based Trust,” Information Systems
Research (15:1), pp. 37-59.
Pearlson, K. E., and Saunders, C. S. 2009. Managing and Using
Information Systems (4th ed.), Hoboken, NJ: John Wiley & Sons.
Pedersen, D. M. 1997. “Psychological Functions of Privacy,”
Journal of Environmental Psychology (17:2), pp. 147-156.
Pedersen, D. M. 1999. “Model for Types of Privacy by Privacy
Functions,” Journal of Environmental Psychology (19:4), pp.
397-405.
Peter, J. P., and Tarpey, S. L. X. 1975. “A Comparative Analysis
of Three Consumer Decision Strategies,” Journal of Consumer
Research (2:1), p. 29.
Phelps, J., Nowak, G., and Ferrell, E. 2000. “Privacy Concerns and
Consumer Willingness to Provide Personal Information,” Journal
of Public Policy and Marketing (19:1), pp. 27-41.
Ponemon Institute. 2007. “Database Security 2007: Threats and
Priorities Within IT Database Infrastructure,” Traverse City, MI
(available at http://www.appsecinc.com/techdocs/whitepapers/
2007-Ponemon-Database-Security-Study-Sponsoredby-Application-Security-Inc.pdf).
Posner, R. A. 1981. “The Economics of Privacy,” American
Economic Review (71:2), pp. 405-409.
Posner, R. A. 1984. “An Economic Theory of Privacy,” in Philosophical Dimensions of Privacy: An Anthology, F. Schoeman
(ed.), New York: Cambridge University Press, pp. 333-345.
Preston, J. 2004. “Judge Strikes Down Section of Patriot Act
Allowing Secret Subpoenas of Internet Data,” The New York
Times, Technology Section, September 30.
Propp, K. M., and Kreps, G. L. 1994. “A Rose by Any Other
Name: The Vitality of Group Communication Research,”
Communication Studies (45:1), pp. 7-19.
Qian, H., and Scott, C. R. 2007. “Anonymity and Self-Disclosure
on Weblogs,” Journal of Computer-Mediated Communication
(12:4), pp. 1428-1451.
Rabin, M., and O’Donoghue, T. 2000. “The Economics of Immediate Gratification,” Journal of Behavioral Decision Making (13),
pp. 233-250.
Regan, P. M. 1995. Legislating Privacy: Technology, Social
Values, and Public Policy, Chapel Hill, NC: University of North
Carolina Press.
Rensel, A. D., Abbas, J. M., and Rao, H. R. 2006. “Private Transactions in Public Places: An Exploration of the Impact of the
Computer Environment on Public Transactional Web Site Use,”
Journal of the Association for Information Systems (7:1), pp.
19-50.
Richards, N. M., and Solove, D. J. 2007. “Privacy’s Other Path:
Recovering the Law of Confidentiality,” Georgetown Law
Journal (96:1), pp. 123-182.
Rifon, N. J., LaRose, R., and Choi, S. M. 2005. “Your Privacy Is
Sealed: Effects of Web Privacy Seals on Trust and Personal
Disclosures,” Journal of Consumer Affairs (39:2), pp. 339-362.
Rindfleisch, T. C. 1997. “Privacy, Information Technology, and
Health Care,” Communications of the ACM (40:8), pp. 92-100.
Rosen, J. 2000. The Unwanted Gaze: The Destruction of Privacy
in America, New York: Random House.
Schoeman, F. D. 1984. Philosophical Dimensions of Privacy: An
Anthology, New York: Cambridge University Press.
Schoenbachler, D. D., and Gordon, G. L. 2002. “Trust and Customer Willingness to Provide Information in Database-Driven
Relationship Marketing,” Journal of Interactive Marketing
(16:3), pp. 2-16.
Schwaig, K. S., Kane, G. C., and Storey, V. C. 2006. “Compliance
to the Fair Information Practices: How Are the Fortune 500
Handling Online Privacy Disclosures?,” Information & Management (43:7), pp. 805-820.
Shah, J. R., White, G. L., and Cook, J. R. 2007. “Privacy Protection Overseas as Perceived by USA-Based IT Professionals,”
Journal of Global Information Management (15:1), pp. 68-81.
Shapiro, S. 1998. “Places and Spaces: The Historical Interaction
of Technology, Home, and Privacy,” Information Society (14:4),
pp. 275-284.
Sheehan, K. B. 1999. “An Investigation of Gender Differences in
On-Line Privacy Concerns and Resultant Behaviors,” Journal of
Interactive Marketing (13:4), pp. 24-38.
Sheehan, K. B. 2002. “Toward a Typology of Internet Users and
Online Privacy Concerns,” Information Society (18:1), pp. 21-32.
Sheehan, K. B., and Hoy, M. G. 1999. “Using E-Mail To Survey
Internet Users in the United States: Methodology and Assessment,” Journal of Computer-Mediated Communication (4:3).
Sheehan, K. B., and Hoy, M. G. 2000. “Dimensions of Privacy
Concern among Online Consumers,” Journal of Public Policy &
Marketing (19:1), pp. 62-73.
Sheng, H., Nah, F., and Siau, K. 2008. “An Experimental Study on
U-Commerce Adoption: The Impact of Personalization and
Privacy Concerns,” Journal of the Association for Information
Systems (9:16), Article 15.
Singer, P. 1991. A Companion to Ethics, Hoboken, NJ: Wiley-Blackwell.
Sipior, J. C., and Ward, B. T. 1995. “The Ethical and Legal
Quandary of Email Privacy,” Communications of the ACM
(38:12), pp. 48-54.
Sipior, J. C., Ward, B. T., and Rainone, S. M. 1998. “Ethical
Management of Employee E-Mail Privacy,” Information Systems
Management (15:1), pp. 41-47.
Smith, H. J. 2001. “Information Privacy and Marketing: What the
US Should (and Shouldn’t) Learn from Europe,” California
Management Review (43:2), pp. 8-33.
Smith, H. J. 2004. “Information Privacy and its Management,” MIS
Quarterly Executive (3:4), pp. 201-213.
Smith, H. J., Milberg, J. S., and Burke, J. S. 1996. “Information
Privacy: Measuring Individuals’ Concerns About Organizational
Practices,” MIS Quarterly (20:2), pp. 167-196.
Sobel, L. A. 1976. War on Privacy, New York: Facts on File.
Solove, D. J. 2004. The Digital Person: Technology and Privacy
in the Information Age, New York: New York University Press.
Solove, D. J. 2006. “A Taxonomy of Privacy,” University of
Pennsylvania Law Review (154:3), pp. 477-560.
Solove, D. J. 2008. Understanding Privacy, Cambridge, MA:
Harvard University Press.
Stewart, K. A., and Segars, A. H. 2002. “An Empirical Examination of the Concern for Information Privacy Instrument,”
Information Systems Research (13:1), pp. 36-49.
Stone, E. F., and Stone, D. L. 1990. “Privacy in Organizations:
Theoretical Issues, Research Findings, and Protection Mechanisms,” Research in Personnel and Human Resources Management (8:3), pp. 349-411.
Stone-Romero, E. F., Stone, D. L., and Hyatt, D. 2003. “Personnel
Selection Procedures and Invasion of Privacy,” Journal of Social
Issues (59:2), pp. 343-368.
Tabak, F., and Smith, W. P. 2005. “Privacy and Electronic Monitoring in the Workplace: A Model of Managerial Cognition and Relational Trust Development,” Employee Responsibilities and Rights Journal (17:3), pp. 173-189.
Tang, Z., Hu, Y. J., and Smith, M. D. 2008. “Gaining Trust
Through Online Privacy Protection: Self-Regulation, Mandatory
Standards, or Caveat Emptor,” Journal of Management Information Systems (24:4), pp. 153-173.
Taylor, C. R. 2004a. “Consumer Privacy and the Market for Customer Information,” RAND Journal of Economics (35:4), pp.
631-650.
Taylor, C. R. 2004b. “Privacy and Information Acquisition in
Competitive Markets,” Working Paper Series, Berkeley Olin
Program in Law & Economics, Berkeley, CA (http://ideas.
repec.org/p/cdl/oplwec/13271.html).
Tefft, S. K. 1980. Secrecy: A Cross-Cultural Perspective, New
York: Human Sciences Press.
Turow, J. 2003. “Americans & Online Privacy: The System Is
Broken,” report from the Annenberg Public Policy Center,
University of Pennsylvania.
Unni, R., and Harmon, R. 2007. “Perceived Effectiveness of Push
vs. Pull Mobile Location-Based Advertising,” Journal of Interactive Advertising (7:2) (http://www.jiad.org/article91).
Van Slyke, C., Shim, J. T., Johnson, R., and Jiang, J. J. 2006.
“Concern for Information Privacy, Risk Perception and Online
Consumer Purchasing,” Journal of the Association for Information Systems (7:6), pp. 415-444.
Walczuch, R. M., and Steeghs, L. 2001. “Implications of the New
EU Directive on Data Protection for Multinational Corporations,”
Information Technology & People (14:2), pp. 142-162.
Wald, M. 2004. “Airline Gave Government Information on
Passengers,” New York Times, Technology Section, January 18.
Waldo, J., Lin, H., and Millett, L. I. 2007. Engaging Privacy and
Information Technology in a Digital Age, Washington, DC:
National Academies Press.
Walsham, G. 1996. “Ethical Theory, Codes of Ethics and IS
Practice,” Information Systems Journal (6:1), pp. 69-81.
Wang, S., Beatty, S. E., and Foxx, W. 2004. “Signaling the Trustworthiness of Small Online Retailers,” Journal of Interactive
Marketing (18:1), pp. 53-69.
Warren, C., and Laslett, B. 1977. “Privacy and Secrecy: Conceptual Comparison,” Journal of Social Issues (33:3), pp. 43-51.
Warren, S. D., and Brandeis, L. D. 1890. “The Right to Privacy,”
Harvard Law Review (4:5), pp. 193-220.
Weinstein, W. L. 1971. “The Private and the Free: A Conceptual
Inquiry,” in Privacy: Nomos XIII, J. R. Pennock and J. W.
Chapman (eds.), New York: Atherton Press, pp. 624-692.
Weisband, S. P., and Reinig, B. A. 1995. “Managing User Perceptions of Email Privacy,” Communications of the ACM (38:12),
pp. 40-47.
Werhane, P. H. 1994. “The Normative/Descriptive Distinction in
Methodologies of Business Ethics,” Business Ethics Quarterly
(4:2), pp. 175-180.
Westin, A. F. 1967. Privacy and Freedom, New York: Atheneum.
Westin, A. F. 2001. “Opinion Surveys: What Consumers Have to
Say About Information Privacy,” Prepared Witness Testimony,
The House Committee on Energy and Commerce, W. J. “Billy”
Tauzin, Chairman, May 8.
Westin, A. F. 2003. “Social and Political Dimensions of Privacy,”
Journal of Social Issues (59:2), pp. 431-453.
White, C., and Christy, D. 1987. “The Information Center Concept:
A Normative Model and a Study of Six Installations,” MIS
Quarterly (11:4), pp. 450-458.
White, T. B. 2004. “Consumer Disclosure and Disclosure Avoidance: A Motivational Framework,” Journal of Consumer Psychology (14:1/2), pp. 41-51.
Xu, H. 2007. “The Effects of Self-Construal and Perceived Control
on Privacy Concerns,” in Proceedings of the 28th International
Conference on Information Systems, Montréal, Canada,
December 9-12.
Xu, H., Dinev, T., Smith, H. J., and Hart, P. 2008. “Examining the
Formation of Individual’s Information Privacy Concerns:
Toward an Integrative View,” in Proceedings of 29th International Conference on Information Systems, Paris, France,
December 14-17.
Xu, H., and Gupta, S. 2009. “The Effects of Privacy Concerns and
Personal Innovativeness on Potential and Experienced
Customers’ Adoption of Location-Based Services,” Electronic
Markets—The International Journal on Networked Business
(19:2), pp. 137-140.
Xu, H., and Teo, H. H. 2004. “Alleviating Consumer’s Privacy
Concern in Location-Based Services: A Psychological Control
Perspective,” in Proceedings of the 25th International Conference on Information Systems, R. Agarwal, L. J. Kirsch, and J.
I. DeGross (eds.), Washington, DC, December 9-12, pp. 793-806.
Xu, H., Teo, H. H., and Tan, B. C. Y. 2005. “Predicting the Adoption of Location-Based Services: The Roles of Trust and Privacy
Risk,” in Proceedings of 26th International Conference on Information Systems, D. Avison, D. Galletta, and J. I. DeGross (eds.),
Las Vegas, NV, December 11-14, pp. 897-910.
Xu, H., Teo, H. H., Tan, B. C. Y., and Agarwal, R. 2010. “The
Role of Push-Pull Technology in Privacy Calculus: The Case of
Location-Based Services,” Journal of Management Information
Systems (26:3), pp. 137-176.
Younger Committee Report. 1972. “Report of the Committee on
Privacy,” Cmnd. 5012, HMSO.
Zweig, D., and Webster, J. 2002. “Where Is the Line between
Benign and Invasive? An Examination of Psychological Barriers
to the Acceptance of Awareness Monitoring Systems,” Journal
of Organizational Behavior (23), pp. 605-633.
Zweig, D., and Webster, J. 2003. “Personality as a Moderator of
Monitoring Acceptance,” Computers in Human Behavior (19),
pp. 479-493.
Zwick, D., and Dholakia, N. 2004. “Whose Identity Is It Anyway?
Consumer Representation in the Age of Database Marketing,”
Journal of Macromarketing (24:1), pp. 31-43.
About the Authors
H. Jeff Smith is the George and Mildred Panuska Professor in
Business in the Department of Decision Sciences and Management
Information Systems at the Farmer School of Business, Miami
University, Oxford, Ohio. His research has appeared in California
Management Review, Communications of the ACM, Harvard
Business Review, MIS Quarterly, MIT Sloan Management Review,
Organization Science, and other journals. He served as an associate
editor at MIS Quarterly. He holds a D.B.A. degree from Harvard
University; his research has examined privacy and ethical issues
associated with information technology and also organizational
impediments to successful implementation of information
technology applications.
Tamara Dinev is an associate professor and chair of the Department
of Information Technology and Operations Management (ITOM),
College of Business, Florida Atlantic University, Boca Raton,
Florida. She received her Ph.D. in Theoretical Physics in 1997.
Following several senior positions in information technology
companies, her interests migrated to management information
systems research and she joined the Florida Atlantic University
ITOM faculty in 2000. Her research interests include information
privacy, trust in online vendors, multicultural aspects of information
technology usage, and information security. She has published in
several journals, including Information Systems Research, Journal
of the AIS, Journal of Strategic Information Systems, Communications of the ACM, International Journal of Electronic Commerce,
European Journal of Information Systems, Journal of Global Information Management, e-Service Journal, and Behaviour and
Information Technology. She has received numerous best paper
awards and nominations at major information system conferences.
Heng Xu is an assistant professor at the Pennsylvania State University where she is a recipient of the endowed PNC Technologies
Career Development Professorship. She received her Ph.D. in
Information Systems in 2005. She currently directs the Privacy
Assurance Lab (PAL), and serves as associate director of the Center
for Cyber-Security, Information Privacy and Trust (LIONS Center)
at Penn State. Her research projects deal with the conceptualization,
intervention, and design aspects of privacy and security. She is a
recipient of the Faculty Early Career Development (CAREER)
Award by the National Science Foundation (2010–2015). Her work
has been published in several journals, including Journal of
Management Information Systems, Information & Management,
Decision Support Systems, DATA BASE for Advances in Information
Systems, Electronic Markets, and Journal of Information Privacy
and Security.
Appendix A
Methodology Used to Identify Research Publications
The multidisciplinary nature of information privacy research poses great challenges, since many concepts of information privacy have only
been considered and developed through the lens of a particular discipline. Our goal was to conduct a multidisciplinary literature review. Our methodology for identifying information privacy publications followed the three-stage approach proposed by Webster and Watson (2002).
First, we performed a search spanning the information systems, social sciences, law, marketing, organization theory, and economics literatures.
We queried library catalogs (for books) and journal databases (ProQuest, ScienceDirect, and the JSTOR archive), and browsed the titles of articles
in leading journals in various disciplines. Browsed journals included ACM Transactions on Internet Technology, American Business Law
Journal, American Economic Review, European Journal of Information Systems, Information & Management, Information Society, Information
Systems Research, International Journal of Electronic Commerce, Journal of Consumer Affairs, Journal of Interactive Marketing, Journal of
Management Information Systems, Journal of Public Policy and Marketing, Journal of Social Issues, Journal of Strategic Information Systems,
Journal of the Association for Information Systems, MIS Quarterly, and Organization Science. We also browsed the proceedings of the
International Conference on Information Systems. We completed full-text electronic searches using keywords including “information privacy”
and “information privacy and technology.” These searches identified a total of 576 articles and 212 books from all disciplines. The title and
abstract of each article and the table of contents of each book were examined to evaluate whether inclusion was warranted (i.e., whether the work
appeared to be concerned with the topic of information privacy). This process provided 230 articles for in-depth review and coding.
Second, to broaden the search beyond the original set of journals and books, we reviewed the citations of the identified articles to determine
additional articles and books to consider. By reviewing the frequency of book citations (as an indicator of contribution to the field
of information privacy) across articles in various disciplines, we also narrowed the book list to those most relevant to our review and thus
warranting further consideration, inclusion, and coding.
Finally, we used the Web of Science to identify additional candidate articles citing the key articles identified in the previous steps. A further
set of 96 articles was collected and coded. This systematic and comprehensive search resulted in a coding set of 320 information privacy
articles and 128 books and book sections provided in the “References” section of Appendix B.
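The first-stage keyword screen described above can be sketched as a simple filter over candidate records. This is an illustrative sketch only: the records, field names, and the `warrants_review` helper are hypothetical, not part of the study's actual tooling, which relied on manual examination of titles, abstracts, and tables of contents.

```python
# Hypothetical candidate records; in the study, candidates came from library
# catalogs and databases such as ProQuest, ScienceDirect, and the JSTOR archive.
candidates = [
    {"title": "Internet Users' Information Privacy Concerns",
     "abstract": "A survey of online privacy attitudes."},
    {"title": "Supply Chain Scheduling Heuristics",
     "abstract": "Optimization of production plans."},
]

# Keywords used in the full-text electronic searches.
KEYWORDS = ("information privacy", "information privacy and technology")

def warrants_review(record):
    """Flag a record for in-depth review if a search keyword
    appears in its title or abstract (case-insensitive)."""
    text = (record["title"] + " " + record["abstract"]).lower()
    return any(keyword in text for keyword in KEYWORDS)

shortlist = [r for r in candidates if warrants_review(r)]
print(len(shortlist))  # only the privacy article survives the screen
```

In the actual study this screening step reduced 576 candidate articles to 230 for in-depth review and coding.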
MIS Quarterly Vol. 35 No. 4—Appendices/December 2011
The goal of our content analysis of the definitional approaches to privacy (Table B4) was to identify the categories of definitions.
Guided by previous literature reviews (Bennett 1995; Margulis 1977; Waldo et al. 2007), we identified four subcategories of privacy
definitions, further grouped into two categories (value-based and cognate-based definitions), per the theoretical arguments presented in this
paper. Next, two independent coders assigned 18 total keywords to the identified categories of our coding scheme (for example, for the
value-based, privacy-as-a-right definition subcategory, the corresponding keywords identified were “right,” “moral value,”
and “duty”). Sixteen keywords were assigned to the same categories by both coders; the rest were dropped from consideration. The same
two coders then assigned each paper to the corresponding category. The resulting inter-coder reliability was .92 (p < .001),
suggesting a high level of agreement between the coders (Landis and Koch 1977).
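The reported inter-coder reliability of .92 is interpreted against the agreement benchmarks of Landis and Koch (1977). As a sketch of how such a statistic can be computed, the following implements Cohen's kappa, a common two-coder agreement measure (the appendix does not specify which statistic was used, and the category labels below are hypothetical):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical assignments:
    (observed agreement - chance agreement) / (1 - chance agreement).
    Assumes the coders do not agree purely by using a single category."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of items both coders placed in the same category.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal category frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical assignments of eight papers to definition subcategories.
a = ["right", "commodity", "state", "control", "right", "state", "control", "commodity"]
b = ["right", "commodity", "state", "control", "right", "control", "control", "commodity"]
print(round(cohens_kappa(a, b), 2))  # → 0.83
```

On the Landis and Koch scale, values above .81 (such as the .92 reported here) indicate "almost perfect" agreement.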
References
Bennett, C. J. 1995. The Political Economy of Privacy: A Review of the Literature, Hackensack, NJ: Center for Social and Legal Research.
Landis, J. R., and Koch, G. G. 1977. “The Measurement of Observer Agreement for Categorical Data,” Biometrics (33:1), pp. 159-174.
Margulis, S. T. 1977. “Conceptions of Privacy: Current Status and Next Steps,” Journal of Social Issues (33:3), pp. 5-21.
Waldo, J., Lin, H., and Millett, L. I. 2007. Engaging Privacy and Information Technology in a Digital Age, Washington, DC: National
Academies Press.
Webster, J., and Watson, R. T. 2002. “Analyzing the Past to Prepare for the Future: Writing a Literature Review,” MIS Quarterly (26:2), pp.
xiii-xxiii.
Appendix B
Information Privacy Research: An Interdisciplinary Review
Table B1. Descriptive Studies on General Privacy by Level of Analysis

Individual Level: Agre and Rotenberg 1998; Bahadur et al. 2002; Berghel 2002; Berscheid 1977; Boncella 2001; Bowers 1997; Burkert 1997; Caloyannides 2001; Caloyannides and Caloyannides 2004; Cannon 2005; Cate 1997; Cavoukian and Tapscott 1997; Cazier 2006; Cranor 2002a, 2003b; DeCew 1997; DePaulo et al. 2003; Fischer-Hübner 2001; Friedman 2004; George 2004; Ghosh 2001; Ghosh and Swaminatha 2001; Goldberg 2003; Goodwin 1991; Gottfried 1994; Gutwirth 2002; Hahn 2002; Henderson 2006; Hendricks et al. 1990; Hirshleifer 1980; Hoffman 1973; Holtzman 2006; Jennings and Fena 2000; Kaufman et al. 2005; Kelvin 1973; Kent et al. 2003; Klopfer and Rubenstein 1977; Kobsa 2007; Koch and Moslein 2005; Langenderfer and Linnhoff 2005; Luo 2002; Lyon and Zureik 1996; Margulis 1974, 1977a, 2003b; Martin 1973; Marx 1999, 2001; Miller 2002; Milne 2000; Mizell 1998; Nowak and Phelps 1997; Peters 1999; Poindexter et al. 2006; Posner 1984; Price et al. 2005; Raab and Bennett 1998; Reiman 1984; Rubin 1988; Salomon 2003; Schwartz 1997; Shapiro 1998; Sipior and Ward 1995; Solove 2004, 2006, 2008; Straub and Collins 1990; Strickland and Hunt 2005; Turner and Dasgupta 2003; Weisband and Reinig 1995; Westin 2003

Group Level: Derlega and Chaikin 1977; Friedman 2004; Laufer and Wolfe 1977; Warren and Laslett 1977

Organizational Level: Ariss 2002; Benassi 1999; Berenbeim 1990; Cannon 2005; Culnan 2003; Culnan and Bies 2003; Decker 1989; Dhillon and Moores 2001; Erbschloe and Vacca 2001; Frye 2001; George 2004; Greenaway and Chan 2005; Katos and Adams 2005; Kayworth et al. 2005; Kent et al. 2003; Mason 1986; Nord et al. 2006; Purcell and Fusaro 2000; Schein 1977; Schwartz 1997; Sewell and Barker 2001; Smith 1994; Smith 2004; Thelwall and Stuart 2006; Walczuch et al. 1995; Walczuch and Steeghs 2001; Winer 2001

Societal Level (International, National, Cross-Cultural):

International: Agre and Rotenberg 1998; Caudill and Murphy 2000; Escudero-Pascual and Hosein 2004; Kobsa 2002; Walczuch et al. 1995; Walczuch and Steeghs 2001; Zwick and Dholakia 1999

National: Alpert 2003; Beaney 1966; Bennett and Raab 1997, 2003; Blanchette and Johnson 2002; Calvert 2000; Cate 1997; Chan et al. 2005; Colomina 1994; Cranor 2002a; DeCew 1997; Dickson 1998; Dill 1986; Etzioni 1999; Flaherty 1989, 1997; Frackman et al. 2002; Froomkin 1999; Gandy 1993, 2003; Gavison 1980; George 2004; Gerard 1980; Gottfried 1994; Grupe 1995; Gutwirth 2002; Henderson 2006; Holden and Millett 2005; Jones 1974; Kenyon and Richardson 2006; Klang and Murray 2005; Kling et al. 1999; Konvitz 1966; Lee 2005; Lin et al. 2004; Livraghi and Monti 2002; Lyon 1994, 2001b; Lyon and Zureik 1996; McCrohan 1989; McWhirter 1994; Metzger and Docter 2003; Monmonier 2002; Moore 1984; Negley 1966; Nicholson 2007; Nissenbaum 2004; Nock 1993; O’Brien 1979; Penders 2004; Pennock and Chapman 1971; Petty 2000; Posner 1981; Post 1989, 2000; Regan 1995, 2003; Rindfleisch 1997; Roboff and Charles 1998; Rubin 1988; Schachter 2003; Schneier and Banisar 1997; Schoeman 1984; Shamis et al. 2005; Slough 1969; Smith 2001; Solove 2004, 2006; Stigler 1980; Thomas and Mauer 1997; Wacks 1993; Ware 1976; Warren and Brandeis 1890; Young 1978

Cross-cultural: Nakada and Tamura 2005; Roberts and Gregor 1971; Waldo et al. 2007
Table B2. Normative Studies on General Privacy by Level of Analysis

Individual Level: Bibas 1994; Brin 1998; Camp 1999; Campbell and Carlson 2002; Clarke 1994a, 1999; Davies 1997; Davison et al. 2003; Dhillon and Moores 2001; Gelbord and Roelofsen 2002; Jones 1991; Jourard 1966; Klosek 2000; Lane 2002; Laudon 1996; Lee 2002; Marx 2003; McGinity 2000; Meeks 1997, 1999, 2003; Simmel 1971; Solove 2007; Spangler et al. 2006; Stallman 2005; Weinstein 1971

Group Level: (no entries)

Organizational Level: Clarke 1994b; Grandison 2005; Greco 2001; Henderson and Snyder 1999; Hubbartt 1998; Lane 2003; Schultz 1988; Sipior et al. 1998; Swire 2003

Societal Level (International, National, Cross-Cultural): Alderman and Kennedy 1997; Allen 2001; Barnes 1980; Benn 1971; Breckenridge 1970; Brenton 1964; Brill 1990; Brin 1998; Clarke 1988, 1994a, 1994b, 2001; Cohen 2001; Diffie and Landau 1998; Flaherty 1997; Forer 1987; Garfinkel 2000; Grupe et al. 2002; Hickson 1987; Hosein 2006; Kalven 1966; Karat et al. 2005; Kling 1995, 1996, 1997; Kling and Allen 1996; Klosek 2007; LaMay 2003; Laudon 1996; Linowes 1989; Lyon 2001a; Madgwick and Smythe 1974; McLean 1995; Melanson 2001; Mizell 1998; Neumann 2005; Nissenbaum 1998; Packard 1964; Pember 1972; Peters 1998; Podesta et al. 2004; Pottie 2004; Raines 1974; Rosen 2000, 2003; Rosenfeld 1995; Sewell and Barker 2001; Shils 1966; Sobel 1976; Sopinka 1997; Spencer 2006; Stalder 2002a; Swire 1997, 1998, 1999, 2001, 2002, 2003; Sykes 1999; Teich et al. 1999; Turow 2003; Volkman 2003; Whitaker 1999
Table B5. Studies Related to Cross-Cultural Contexts

(Only the populated cells are shown; the group- and organizational-level columns are empty except where noted.)

Collectivism/Individualism (Hofstede 1980)
Individual Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 1995; Milberg et al. 2000; Sagi et al. 2004
Societal/Cultural/International Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 1995; Milberg et al. 2000; Sagi et al. 2004

Power Distance Index (Hofstede 1980)
Individual Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 1995; Milberg et al. 2000; Sagi et al. 2004
Societal/Cultural/International Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 1995; Milberg et al. 2000; Sagi et al. 2004

Masculinity/Femininity (Hofstede 1980)
Individual Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 2000; Sagi et al. 2004
Societal/Cultural/International Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 2000; Sagi et al. 2004

Uncertainty Avoidance Index (Hofstede 1980)
Individual Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 1995; Milberg et al. 2000; Sagi et al. 2004
Societal/Cultural/International Level: Bellman et al. 2004; Dinev et al. 2006a, 2006b; Milberg et al. 1995; Milberg et al. 2000; Sagi et al. 2004

Long-Term Time Orientation (Hofstede 1980)
Individual Level: Dinev et al. 2006a, 2006b
Societal/Cultural/International Level: Dinev et al. 2006a, 2006b

Perceptions of Space (Hall and Hall 1990; Laufer and Wolfe 1977)
Individual Level: Dinev et al. 2006a, 2006b; Zakaria et al. 2003
Societal/Cultural/International Level: Zakaria et al. 2003

Social Capital/Trust (Fukuyama 1995)
Individual Level: Dinev et al. 2006a, 2006b

Relative Magnitude of Private and Public Space (Trompenaars 1993)
Individual Level: Dinev et al. 2006a
Societal/Cultural/International Level: Dinev et al. 2006a

Societal Role of Privacy - Individual (Smith 2001)
Individual Level: Altman 1977; Moore 1984; Newell 1998; Solove 2008
Group Level: Altman 1977
Societal/Cultural/International Level: Altman 1977; Klosek 2007; Moore 1984; Newell 1998; Roberts and Gregor 1971; Solove 2008
Table B8. Studies Related to Political Context

(Only the individual- and societal-level columns are populated; the group- and organizational-level columns contain no entries.)

Privacy vs. Law Enforcement (searches and seizures)
Individual Level: Alderman and Kennedy 1997; Brin 1998; Forer 1987; McLean 1995; McWhirter 1994; Raines 1974; Rosen 2000
Societal Level: Alderman and Kennedy 1997; Breckenridge 1970; Brenton 1964; Brin 1998; Forer 1987; Hickson 1987; Madgwick and Smythe 1974; McLean 1995; Packard 1964; Rosen 2000

Privacy and Self (abortions and embryos, right-to-die, access to personal health records by courts and insurance companies)
Individual Level: Alderman and Kennedy 1997; Allen 2001; Brill 1990; Linowes 1989; McLean 1995; Mizell 1998; Sykes 1999
Societal Level: Alderman and Kennedy 1997; Brill 1990; McLean 1995

Privacy vs. Press (private facts, exposure, celebrity culture, intrusion)
Individual Level: Alderman and Kennedy 1997; Lane 2002; McLean 1995; Pember 1972; Raines 1974; Rosen 2000
Societal Level: Alderman and Kennedy 1997; Calvert 2000; LaMay 2003; Lane 2002; McLean 1995; Rosen 2000

Government Surveillance
Individual Level: Brin 1998; Linowes 1989; Mizell 1998
Societal Level: Brin 1998; Diffie and Landau 1998; Flaherty 1989, 1997; Gandy 1993; Garfinkel 2000; Klosek 2007; Lane 2002; Lyon 1994, 2001b; Melanson 2001; Mizell 1998; Monmonier 2002; Nock 1993; Peters 1999; Podesta et al. 2004; Raines 1974; Rosen 2000, 2003; Schneier and Banisar 1997; Sobel 1976; Solove 2006; Sykes 1999; Whitaker 1999
References
Ackerman, M. 2004. “Privacy in Pervasive Environments: Next Generation Labeling Protocols,” Personal and Ubiquitous Computing (8:6),
pp. 430-439.
Ackerman, M., Cranor, L., and Reagle, J. 1999. “Privacy in E-Commerce: Examining User Scenarios and Privacy Preferences,” in
Proceeding of the 1st ACM Conference on Electronic Commerce, New York: ACM Press, pp. 1-8.
Acquisti, A. 2004. “Privacy in Electronic Commerce and the Economics of Immediate Gratification,” in Proceedings of the 5th ACM Electronic
Commerce Conference, New York: ACM Press, pp. 21-29.
Acquisti, A., and Grossklags, J. 2005a. “Privacy and Rationality in Individual Decision Making,” IEEE Security & Privacy (3:1), pp. 26-33.
Acquisti, A., and Grossklags, J. 2005b. “Uncertainty, Ambiguity and Privacy,” in Fourth Annual Workshop Economics and Information
Security, pp. 2-3.
Acquisti, A., and Varian, H. R. 2005. “Conditioning Prices on Purchase History,” Marketing Science (24:3), pp. 367-381.
Adams, A., and Blandford, A. 2005. “Bridging the Gap between Organizational and User Perspectives of Security in the Clinical Domain,”
International Journal of Human-Computer Studies (63:1/2), pp. 175-202.
Adar, E., Lukose, R., Sengupta, C., Tyler, J., and Good, N. 2003. “Shock: Aggregating Information While Preserving Privacy,” Information
Systems Frontiers (5:1), pp. 15-28.
Agre, P. E., and Rotenberg, M. 1998. Technology and Privacy: The New Landscape, Cambridge, MA: MIT Press.
Alderman, E., and Kennedy, C. 1997. The Right to Privacy, New York: Vintage Books.
Allen, A. L. 2001. “Is Privacy Now Possible? A Brief History of an Obsession,” Social Research (68:1), pp. 301-306.
Alpert, S. A. 2003. “Protecting Medical Privacy: Challenges in the Age of Genetic Information,” Journal of Social Issues (59:2), pp. 301-322.
Altman, I. 1974. “Privacy: A Conceptual Analysis,” in Man-Environment Interactions: Evaluations and Applications: Part 2, D. H. Carson
(ed.), Washington, DC: Environmental Design Research Association, pp. 3-28.
Altman, I. 1975. The Environment and Social Behavior: Privacy, Personal Space, Territory, and Crowding, Monterey, CA: Brooks/Cole
Publishing.
Altman, I. 1977. “Privacy Regulation: Culturally Universal or Culturally Specific?,” Journal of Social Issues (33:3), pp. 66-84.
Anderson, S. E., and Gansneder, B. M. 1995. “Using Electronic Mail Surveys and Computer-Monitored Data for Studying Computer-Mediated
Communication Systems,” Social Science Computer Review (13), pp. 33-46.
Angst, C. M., and Agarwal, R. 2006. “Overcoming Personal Barriers to Adoption when Technology Enables Information to be Available to
Others,” in Proceedings of the 27th International Conference on Information Systems, Milwaukee, WI, pp. 1331-1339.
Anton, A. I., Earp, J. B., He, Q., Stufflebeam, W., Bolchini, D., and Jensen, C. 2004. “Financial Privacy Policies and the Need for
Standardization,” IEEE Security and Privacy (2:2), pp. 36-45.
Ariss, S. S. 2002. “Computer Monitoring: Benefits and Pitfalls Facing Management,” Information & Management (39:7), pp. 553-558.
Awad, N. F., and Krishnan, M. S. 2006. “The Personalization Privacy Paradox: An Empirical Evaluation of Information Transparency and
the Willingness to be Profiled Online for Personalization,” MIS Quarterly (30:1), pp. 13-28.
Bacard, A. 1995. The Computer Privacy Handbook, Berkeley, CA: Peachpit Press.
Bahadur, G., Chan, W., and Weber, C. 2002. Privacy Defended: Protecting Yourself Online, Indianapolis, IN: Que.
Bapna, S., and Gangopadhyay, A. 2006. “A Wavelet-Based Approach to Preserve Privacy for Classification Mining,” Decision Sciences
(37:4), pp. 623-642.
Barnes, J. A. 1980. Who Should Know What? Social Science, Privacy, and Ethics, Cambridge, UK: Cambridge University Press.
Beaney, W. M. 1966. “The Right to Privacy and American Law,” Law and Contemporary Problems (31:2), pp. 253-271.
Belanger, F., Hiller, J. S., and Smith, W. J. 2002. “Trustworthiness in Electronic Commerce: The Role of Privacy, Security, and Site
Attributes,” Journal of Strategic Information Systems (11:3/4), pp. 245-270.
Bellman, S., Johnson, E. J., Kobrin, S. J., and Lohse, G. L. 2004. “International Differences in Information Privacy Concerns: A Global
Survey of Consumers,” Information Society (20:5), pp. 313-324.
Benassi, P. 1999. “TRUSTe: An Online Privacy Seal Program,” Communications of the ACM (42:2), pp. 56-59.
Benn, S. I. 1971. “Privacy, Freedom and Respect for Persons,” in Nomos XIII: Privacy, J. R. Pennock and J. W. Chapman (eds.), New York:
Atherton Press, pp. 56-70.
Bennett, C. J. 1992. Regulating Privacy: Data Protection and Public Policy in Europe and the United States, Ithaca, NY: Cornell University
Press.
Bennett, C. J., and Raab, C. D. 1997. “The Adequacy of Privacy: The European Union Data Protection Directive and the North American
Response,” Information Society (13:3), pp. 245-263.
Bennett, C. J., and Raab, C. D. 2003. The Governance of Privacy: Policy Instruments in Global Perspective, Burlington, VT: Ashgate,
Aldershot.
Berenbeim, R. 1990. Employee Privacy, New York: The Conference Board.
Berendt, B., Gunther, O., and Spiekermann, S. 2005. “Privacy in E-Commerce: Stated Preferences vs. Actual Behavior,” Communications
of the ACM (48:4), pp. 101-106.
Berghel, H. 2002. “Hijacking the Web,” Communications of the ACM (45:4), pp. 23-27.
Berkovsky, S., Eytani, Y., Kuflik, T., and Ricci, F. 2005. “Privacy-Enhanced Collaborative Filtering,” in PEP05, UM05 Workshop on
Privacy-Enhanced Personalization, Edinburgh, Scotland, pp. 75-83.
Berkovsky, S., Eytani, Y., Kuflik, T., and Ricci, F. 2006. “Hierarchical Neighborhood Topology for Privacy Enhanced Collaborative
Filtering,” in Proceedings of PEP06, CHI 2006 Workshop on Privacy-Enhanced Personalization, Montreal, Canada, pp. 6-13.
Berscheid, E. 1977. “Privacy: Hidden Variable in Experimental Social-Psychology,” Journal of Social Issues (33:3), pp. 85-101.
Bibas, S. A. 1994. “A Contractual Approach to Data Privacy,” Harvard Journal of Law & Public Policy (17:2), pp. 591-605.
Blanchette, J. F., and Johnson, D. G. 2002. “Data Retention and the Panoptic Society: The Social Benefits of Forgetfulness,” Information
Society (18:1), pp. 33-45.
Boncella, R. J. 2001. “Internet Privacy – At Home and at Work,” Communications of the AIS (7:14).
Bonner, W. 2007. “Locating a Space for Ethics to Appear in Decision-Making: Privacy as an Exemplar,” Journal of Business Ethics (70:3),
pp. 221-234.
Bonner, W., and Chiasson, M. 2005. “If Fair Information Principles Are the Answer, What Was the Question? An Actor-Network Theory
Investigation of the Modern Constitution of Privacy,” Information and Organization (15:4), pp. 267-293.
Bowers, D. 1997. “Privacy Online,” Marketing Research (9:3), pp. 37-38.
Branscomb, A. W. 1994. Who Owns Information? From Privacy to Public Access, New York: Basic Books.
Breckenridge, A. C. 1970. The Right to Privacy, Lincoln, NE: University of Nebraska Press.
Brenton, M. 1964. The Privacy Invaders, New York: Coward McCann.
Brill, A. 1990. Nobody’s Business: Paradoxes of Privacy, Reading, MA: Addison-Wesley.
Brin, D. 1998. The Transparent Society: Freedom Vs. Privacy in a City of Glass Houses, New York: Basic Books.
Buchanan, T., Paine, C., Joinson, A. N., and Reips, U. 2007. “Development of Measures of Online Privacy Concern and Protection for Use
on the Internet,” Journal of the American Society for Information Science and Technology (58:2), pp. 157-165.
Burkert, H. 1997. “Privacy-Enhancing Technologies: Typology, Critique, Vision,” in Technology and Privacy: The New Landscape, P. E.
Agre and M. Rotenberg (eds.), Boston: MIT Press, pp. 126-143.
Caloyannides, M. A. 2001. Computer Forensics and Privacy, Boston: Artech House.
Caloyannides, M. A., and Caloyannides, M. A. 2004. Privacy Protection and Computer Forensics (2nd ed.), Boston: Artech House.
Calvert, C. 2000. Voyeur Nation: Media, Privacy, and Peering in Modern Culture, Boulder, CO: Westview Press.
Calzolari, G., and Pavan, A. 2006. “On the Optimality of Privacy in Sequential Contracting,” Journal of Economic Theory (130:1), pp.
168-204.
Camp, L. J. 1999. “Web Security and Privacy: An American Perspective,” Information Society (15:4), pp. 249-256.
Campbell, A. J. 1997. “Relationship Marketing in Consumer Markets,” Journal of Direct Marketing (11:3), pp. 44-57.
Campbell, J. E., and Carlson, M. 2002. “Panopticon.com: Online Surveillance and the Commodification of Privacy,” Journal of Broadcasting
& Electronic Media (46:4), pp. 586-606.
Cannon, J. C. 2005. Privacy: What Developers and IT Professionals Should Know, Boston: Addison-Wesley.
Canny, J. 2002. “Collaborative Filtering with Privacy,” in Proceedings of the IEEE Conference on Security and Privacy, Oakland, CA, pp.
45-57.
Cate, F. H. 1997. Privacy in the Information Age, Washington, DC: Brookings Institution Press.
Caudill, E. M., and Murphy, P. E. 2000. “Consumer Online Privacy: Legal and Ethical Issues,” Journal of Public Policy & Marketing (19:1),
pp. 7-19.
Cavoukian, A., and Tapscott, D. 1997. Who Knows: Safeguarding Your Privacy in a Networked World, New York: McGraw-Hill.
Cazier, J. A. 2006. Value Congruence and Trust Online: Their Impact on Privacy and Price Premiums, Youngstown, NY: Cambria Press.
Chan, Y. E., Culnan, M. J., Greenaway, K., Laden, G., Levin, T., and Smith, H. J. 2005. “Information Privacy: Management, Marketplace,
and Legal Challenges,” Communications of the AIS (16:1), pp. 593-634.
Chellappa, R. K. 2008. “Consumers’ Trust in Electronic Commerce Transactions: The Role of Perceived Privacy and Perceived Security,”
unpublished manuscript, Emory University, Atlanta, GA.
Chellappa, R. K., and Shivendu, S. 2008. “An Economic Model of Privacy: A Property Rights Approach to Regulatory Choices for Online
Personalization,” Journal of Management Information Systems (24:3), pp. 193-225.
Chellappa, R. K., and Sin, R. 2005. “Personalization Versus Privacy: An Empirical Examination of the Online Consumer’s Dilemma,”
Information Technology and Management (6:2), pp. 181-202.
Cingil, I., Dogac, A., and Azgin, A. 2000. “A Broader Approach to Personalization: Establishing an Architecture for Providing Automatically
Generated, Machine Processable, Dynamic User Profiles to Web Servers While Confirming to Users’ Privacy Preferences,” Communications
of the ACM (43:8), pp. 136-141.
Clarke, R. 1988. “Information Technology and Dataveillance,” Communications of the ACM (31:5), pp. 498-512.
Clarke, R. 1994a. “The Digital Persona and its Application to Data Surveillance,” The Information Society (10:2), pp. 77-92.
Clarke, R. 1994b. “Human Identification in Information Systems: Management Challenges and Public Policy Issues,” Information Technology
& People (7:4), pp. 6-37.
Clarke, R. 1999. “Internet Privacy Concerns Confirm the Case for Intervention,” Communications of the ACM (42:2), pp. 60-67.
Clarke, R. 2001. “Person Location and Person Tracking: Technologies, Risks and Policy Implications,” Information Technology & People
(14:2), pp. 206-231.
Cohen, J. E. 2001. “Privacy, Ideology, and Technology: A Response to Jeffrey Rosen,” Georgetown Law Journal (89), p. 2029.
Colomina, B. 1994. Privacy and Publicity: Modern Architecture as Mass Media, Cambridge, MA: MIT Press.
Couture, B., and Kent, T. 2004. The Private, the Public, and the Published: Reconciling Private Lives and Public Rhetoric, Logan, UT: Utah
State University Press.
Cranor, L. F. 1999. “Internet Privacy – Introduction,” Communications of the ACM (42:2), pp. 28-31.
Cranor, L. F. 2001a. Web Privacy with P3P, Sebastopol, CA: O’Reilly.
Cranor, L. F. 2001b. “Ten Years of Computers, Freedom, and Privacy,” Information Society (18:3), p. 151.
Cranor, L. F. 2003a. “‘I Didn’t Buy it for Myself’: Privacy and Ecommerce Personalization,” in Proceedings of the ACM Workshop on
Privacy in the Electronic Society, Washington, DC, October 30.
Cranor, L. F. 2003b. “The Role of Privacy Enhancing Technologies,” in Considering Consumer Privacy: A Resource for Policymakers and
Practitioners, Washington, DC: The Center for Democracy & Technology, pp. 80-83.
Cranor, L. F., Guduru, P., and Arjula, M. 2006. “User Interfaces for Privacy Agents,” ACM Transactions on Computer-Human Interaction
(13:2), pp. 135-178.
Culnan, M. J. 1993. “‘How Did They Get My Name’? An Exploratory Investigation of Consumer Attitudes toward Secondary Information
Use,” MIS Quarterly (17:3), pp. 341-364.
Culnan, M. J. 1995. “Consumer Awareness of Name Removal Procedures: Implication for Direct Marketing,” Journal of Interactive
Marketing (9), pp. 10-19.
Culnan, M. J. 2000. “Protecting Privacy Online: Is Self-Regulation Working?,” Journal of Public Policy and Marketing (19:1), pp. 20-26.
Culnan, M. J. 2003. “Consumer Privacy, Technology and Policy,” in Computers in Society: Privacy, Ethics and the Internet, J. F. George
(ed.), Upper Saddle River, NJ: Pearson/Prentice Hall, pp. 171-183.
Culnan, M. J. 2005. “Giant Food and Elensys: Looking Out for Customer Gross Privacy Invasion?,” Communications of the AIS (16:1), pp.
317-328.
Culnan, M. J., and Armstrong, P. K. 1999. “Information Privacy Concerns, Procedural Fairness and Impersonal Trust: An Empirical
Investigation,” Organization Science (10:1), pp. 104-115.
Culnan, M. J., and Bies, R.J. 2003. “Consumer Privacy: Balancing Economic and Justice Considerations,” Journal of Social Issues (59:2),
pp. 323-342.
Davies, S. G. 1997. “Re-Engineering the Right to Privacy: How Privacy Has Been Transformed from a Right to a Commodity,” in Technology
and Privacy: The New Landscape, P. E. Agre and M. Rotenberg (eds.), Cambridge, MA: MIT Press, pp. 143-165.
Davis, D. W., and Silver, B. D. 2004. “Civil Liberties vs. Security: Public Opinion in the Context of the Terrorist Attacks on America,”
American Journal of Political Science (48:1), pp. 28-46.
Davison, R. M., Clarke, R., Langford, D., Kuo, F.-Y., and Smith, H. J. 2003. “Information Privacy in a Globally Networked Society: Implications
for IS Research,” Communications of the AIS (12:1), pp. 341-365.
DeCew, J. W. 1997. In Pursuit of Privacy: Law, Ethics, and the Rise of Technology, Ithaca, NY: Cornell University Press.
Decker, K. H. 1989. A Manager’s Guide to Employee Privacy: Laws, Policies and Procedures, New York: Wiley.
Dennis, C. 1995. “Who Owns Information? From Privacy to Public Access,” Journal of the American Society for Information Science (46:9),
pp. 714-716.
DePaulo, B. M., Wetzel, C., Sternglanz, R. W., and Wilson, M. J. W. 2003. “Verbal and Nonverbal Dynamics of Privacy, Secrecy, and
Deceit,” Journal of Social Issues (59:2), pp. 391-410.
Derlega, V. J. 1993. Self-Disclosure, Newbury Park, CA: Sage Publications.
Derlega, V. J., and Chaikin, A. L. 1977. “Privacy and Self-Disclosure in Social Relationships,” Journal of Social Issues (33:3), pp. 102-115.
Dhillon, G. S., and Moores, T. 2001. “Internet Privacy: Interpreting Key Issues,” Information Resources Management Journal (14:4), p. 33.
Di Pietro, R., and Mancini, L. V. 2003. “Security and Privacy Issues of Handheld and Wearable Wireless Devices,” Communications of the
ACM (46:9), pp. 74-79.
Dickson, D. T. 1998. Confidentiality and Privacy in Social Work: A Guide to the Law for Practitioners and Students, New York: Free Press.
Diffie, W., and Landau, S. E. 1998. Privacy on the Line: The Politics of Wiretapping and Encryption, Cambridge, MA: MIT Press.
Dill, B. 1986. The Journalist’s Handbook on Libel and Privacy, New York: Free Press.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., and Colautti, C. 2006a. “Internet Users’ Privacy Concerns and Beliefs About Government
Surveillance: An Exploratory Study of Differences between Italy and the United States,” Journal of Global Information Management (14:4),
pp. 57-93.
Dinev, T., Bellotto, M., Hart, P., Russo, V., Serra, I., and Colautti, C. 2006b. “Privacy Calculus Model in E-Commerce – A Study of Italy
and the United States,” European Journal of Information Systems (15:4), pp. 389-402.
Dinev, T., and Hart, P. 2004. “Internet Privacy Concerns and Their Antecedents – Measurement Validity and a Regression Model,” Behavior
and Information Technology (23:6), pp. 413-423.
Dinev, T., and Hart, P. 2006. “An Extended Privacy Calculus Model for E-Commerce Transactions,” Information Systems Research (17:1),
pp. 61-80.
Dinev, T., Hart, P., and Mullen, M. R. 2008. “Internet Privacy Concerns and Beliefs About Government Surveillance – An Empirical
Investigation,” Journal of Strategic Information Systems (17:3), pp. 214-233.
Dolnicar, S., and Jordaan, Y. 2007. “A Market-Oriented Approach to Responsibly Managing Information Privacy Concerns in Direct
Marketing,” Journal of Advertising (36:2), pp. 123-149.
Dommeyer, C. J., and Gross, B. L. 2003. “What Consumers Know and What They Do: An Investigation of Consumer Knowledge, Awareness,
and Use of Privacy Protection Strategies,” Journal of Interactive Marketing (17), pp. 34-51.
Earp, J. B., Anton, A. I., Aiman-Smith, L., and Stufflebeam, W. H. 2005. “Examining Internet Privacy Policies Within the Context of User
Privacy Values,” IEEE Transactions on Engineering Management (52:2), pp. 227-237.
Earp, J. B., and Baumer, D. 2003. “Innovative Web Use to Learn About Consumer Behavior and Online Privacy,” Communications of the
ACM (46:4), pp. 81-83.
Earp, J. B., and Payton, F. C. 2006. “Information Privacy in the Service Sector: An Exploratory Study of Health Care and Banking
Professionals,” Journal of Organizational Computing and Electronic Commerce (16:2), pp. 105-122.
Eddy, E. R., Stone, D. L., and Stone-Romero, E. F. 1999. “The Effects of Information Management Policies on Reactions to Human Resource
Information Systems: An Integration of Privacy and Procedural Justice Perspectives,” Personnel Psychology (52), pp. 335-358.
Enzmann, M., and Schneider, M. 2005. “Improving Customer Retention in E-Commerce through a Secure and Privacy-Enhanced Loyalty
System,” Information Systems Frontiers (7:4/5), pp. 359-370.
Eoyang, C. K. 1974. “Effects of Group Size and Privacy in Residential Crowding,” Journal of Personality and Social Psychology (30:3), pp.
389-392.
Erbschloe, M., and Vacca, J. R. 2001. Net Privacy: A Guide to Developing and Implementing an Ironclad eBusiness Privacy Plan, New York:
McGraw-Hill.
Escudero-Pascual, A., and Hosein, I. 2004. “Questioning Lawful Access to Traffic Data,” Communications of the ACM (47:3), pp. 77-82.
Etzioni, A. 1999. The Limits of Privacy, New York: Basic Books.
MIS Quarterly Vol. 35 No. 4—Appendices/December 2011
A17
Smith et al./Information Privacy Research
Faja, S., and Trimi, S. 2006. “Influence of the Web Vendor’s Interventions on Privacy-Related Behaviors in E-Commerce,” Communications
of the AIS (17:1), pp. 593-634.
Fischer-Hübner, S. 2001. IT-Security and Privacy: Design and Use of Privacy-Enhancing Security Mechanisms, New York: Springer.
Flaherty, D. H. 1989. Protecting Privacy in Surveillance Societies: The Federal Republic of Germany, Sweden, France, Canada, and the
United States, Chapel Hill, NC: University of North Carolina Press.
Flaherty, D. H. 1997. “Controlling Surveillance: Can Privacy Protection Be Made Effective?,” in Technology and Privacy: The New
Landscape, P. E. Agre and M. Rotenberg (eds.), Cambridge, MA: MIT Press, pp. 167-192.
Forer, L. G. 1987. A Chilling Effect: The Mounting Threat of Libel and Invasion of Privacy Actions to the First Amendment, New York:
Norton.
Frackman, A., Ray, C., and Martin, R. C. 2002. Internet and Online Privacy: A Legal and Business Guide, New York: ALM Publishing.
Freedman, W. 1987. The Right of Privacy in the Computer Age, New York: Quorum Books.
Friedman, L. M. 2004. Private Lives: Families, Individuals, and the Law, Cambridge, MA: Harvard University Press.
Froomkin, A. M. 1999. “Legal Issues in Anonymity and Pseudonymity,” Information Society (15:2), pp. 113-127.
Frye, C. 2001. Privacy-Enhanced Business: Adapting to the Online Environment, Westport, CT: Quorum Books.
Fukuyama, F. 1995. Trust: Social Virtues and the Creation of Prosperity, New York: The Free Press.
Fusilier, M. R., and Hoyer, W. D. 1980. “Variables Affecting Perceptions of Invasion of Privacy in a Personnel Selection Situation,” Journal
of Applied Psychology (65:5), pp. 623-626.
Gabber, E., Gibbons, P. B., Matias, Y., and Mayer, A. 1997. “How to Make Personalized Web Browsing Simple, Secure, and Anonymous,”
in Financial Cryptography: First International Conference, FC’97, R. Hirschfeld (ed.), Berlin: Springer Verlag, pp. 17-31.
Gandy, O. H. 1993. The Panoptic Sort: A Political Economy of Personal Information, Boulder, CO: Westview Press.
Gandy, O. H. 2003. “Public Opinion Surveys and the Formation of Privacy Policy,” Journal of Social Issues (59:2), pp. 283-299.
Garfinkel, R., Gopal, R., and Goes, P. 2002. “Privacy Protection of Binary Confidential Data Against Deterministic, Stochastic, and Insider
Threat,” Management Science (48:6), pp. 749-764.
Garfinkel, S. 2000. Database Nation: The Death of Privacy in the 21st Century, Sebastopol, CA: O’Reilly.
Gavison, R. 1980. “Privacy and the Limits of Law,” The Yale Law Journal (89:3), pp. 421-471.
Gelbord, B., and Roelofsen, G. 2002. “New Surveillance Techniques Raise Privacy Concerns,” Communications of the ACM (45:11), pp.
23-24.
George, J. F. 2004. Computers in Society: Privacy, Ethics, and the Internet, Upper Saddle River, NJ: Pearson/Prentice Hall.
Gerard, S. 1980. “A Progress Report on Information Privacy and Data Security,” Journal of the American Society for Information Science
(31:2), pp. 75-83.
Ghosh, A. K. 2001. E-Commerce Security and Privacy, Boston: Kluwer Academic Publishers.
Ghosh, A. K., and Swaminatha, T. M. 2001. “Software Security and Privacy Risks in Mobile E-Commerce,” Communications of the ACM
(44:2), pp. 51-57.
Gideon, J., Cranor, L., Egelman, S., and Acquisti, A. 2006. “Power Strips, Prophylactics, and Privacy, Oh My!,” in Proceedings of the Second
Symposium on Usable Privacy and Security, Pittsburgh, PA, pp. 133-144.
Goldberg, I. A. 2003. “Privacy-Enhancing Technologies for the Internet II: Five Years Later,” in Proceedings of the Second International
Conference on Privacy Enhancing Technologies, R. Dingledine and P. Syverson (eds.), Berlin: Springer Verlag, pp. 1-12.
Goodwin, C. 1991. “Privacy: Recognition of a Consumer Right,” Journal of Public Policy and Marketing (10:1), pp. 149-166.
Gottfried, T. 1994. Privacy: Individual Right v. Social Needs, Brookfield, CT: Millbrook Press.
Gould, J. P. 1980. “Privacy and the Economics of Information,” The Journal of Legal Studies (9:4), pp. 827-842.
Grandison, T. 2005. “Don’t Trade Privacy for Security,” Communications of the ACM (48:2), p. 13.
Greco, J. 2001. “Privacy—Whose Right Is It Anyhow?,” The Journal of Business Strategy (22:1), pp. 32-35.
Greenaway, K. E., and Chan, Y. E. 2005. “Theoretical Explanations for Firms’ Information Privacy Behaviors,” Journal of the AIS (6:6), pp.
171-198.
Greenberg, C. I., and Firestone, I. J. 1977. “Compensatory Responses to Crowding – Effects of Personal-Space Intrusion and Privacy
Reduction,” Journal of Personality and Social Psychology (35:9), pp. 637-644.
Grupe, F. H. 1995. “Commercializing Public Information – A Critical Issue for Governmental IS Professionals,” Information & Management
(28:4), pp. 229-241.
Grupe, F. H., Kuechler, W., and Sweeney, S. 2002. “Dealing with Data Privacy Protection: An Issue for the 21st Century,” Information
Systems Management (19:4), pp. 61-70.
Gutwirth, S. 2002. Privacy and the Information Age, Lanham, MD: Rowman & Littlefield Publishers.
Hahn, H. 2002. Internet Insecurity: Why Your Privacy, Security, and Safety Are at Risk and What You Can Do About It, Upper Saddle River,
NJ: Prentice Hall.
Hann, I.-H., Hui, K. L., Lee, T., and Png, I. P. L. 2002. “Online Information Privacy: Measuring the Cost-Benefit Tradeoff,” in Proceedings
of the 23rd International Conference on Information Systems, L. Applegate, R. D. Galliers, and J. I. DeGross (eds.), Barcelona, Spain, pp.
1-10.
Hann, I.-H., Hui, K.-L., Lee, S. Y. T., and Png, I. P. L. 2007. “Overcoming Online Information Privacy Concerns: An Information-Processing
Theory Approach,” Journal of Management Information Systems (24:2), pp. 13-42.
Henderson, H. 2006. Privacy in the Information Age (Rev. ed.), New York: Facts On File.
Henderson, S. C., and Snyder, C. A. 1999. “Personal Information Privacy: Implications for MIS Managers,” Information & Management
(36:4), pp. 213-220.
Hendricks, E., Hayden, T., and Novik, J. 1990. Your Right to Privacy: A Basic Guide to Legal Rights in an Information Society
(2nd ed.), Carbondale, IL: Southern Illinois University Press.
Hixson, R. F. 1987. Privacy in a Public Society: Human Rights in Conflict, New York: Oxford University Press.
Hine, C., and Eve, J. 1998. “Privacy in the Marketplace,” Information Society (14:4), pp. 253-262.
Hinnant, C. C., and O’Looney, J. A. 2003. “Examining Pre-adoption Interest in Online Innovations: An Exploratory Study of e-Service
Personalization in the Public Sector,” IEEE Transactions on Engineering Management (50:4), pp. 436-447.
Hirshleifer, J. 1980. “Privacy: Its Origin, Function, and Future,” The Journal of Legal Studies (9:4), pp. 649-664.
Hoffman, D. L., Novak, T. P., and Peralta, M. A. 1999a. “Building Consumer Trust Online,” Communications of the ACM (42:4), pp. 80-85.
Hoffman, D. L., Novak, T. P., and Peralta, M. A. 1999b. “Information Privacy in the Marketspace: Implications for the Commercial Uses
of Anonymity on the Web,” Information Society (15:2), pp. 129-139.
Hoffman, L. J. 1973. Security and Privacy in Computer Systems, Los Angeles: Melville Publishing Company.
Hofstede, G. 1980. Culture’s Consequences: International Differences in Work-Related Values, Beverly Hills, CA: Sage Publications.
Holden, S. H., and Millett, L. I. 2005. “Authentication, Privacy, and the Federal E-Government,” Information Society (21:5), pp. 367-377.
Holtzman, D. H. 2006. Privacy Lost: How Technology Is Endangering Your Privacy, San Francisco: Jossey-Bass.
Hong, T., McLaughlin, M., Pryor, L., Beaudoin, C., and Grabowicz, P. 2005. “Internet Privacy Practices of News Media and Implications
for Online Journalism,” Journalism Studies (6:1), pp. 15-28.
Hosein, I. 2006. “Knowledge Power: Intellectual Property, Information, and Privacy,” Information Society (22:5), pp. 367-368.
Hsu, S. 2003. “Internet Privacy and Security: A Startup’s Perspective” (Information Dynamics in the Networked Society, April 4-5, 2002),
Information Systems Frontiers (5:1), pp. 9-13.
Hubbartt, W. S. 1998. The New Battle over Workplace Privacy: How Far Can Management Go? What Rights Do Employees Have? Safe
Practices to Minimize Conflict, Confusion, and Litigation, New York: AMACOM.
Huberman, B. A., Adar, E., and Fine, L. R. 2005. “Valuating Privacy,” in Proceedings of the Fourth Workshop on the Economics of
Information Security, Cambridge, MA, October 6.
Hui, K.-L., Tan, B. C. Y., and Goh, C.-Y. 2006. “Online Information Disclosure: Motivators and Measurements,” ACM Transactions on
Internet Technology (6), pp. 415-441.
Hui, K.-L., and Png, I. P. L. 2005. “Economics of Privacy,” Chapter 8 in Economics and Information Systems, Volume 1, T. Hendershott (ed.),
Amsterdam: Elsevier.
Hui, K.-L., Teo, H. H., and Lee, S. Y. T. 2007. “The Value of Privacy Assurance: An Exploratory Field Experiment,” MIS Quarterly (31:1),
pp. 19-33.
Hung, P. C. K., Chiu, D. K. W., Fung, W. W., Cheung, W. K., Wong, R., Choi, S. P. M., Kafeza, E., Kwok, J., Pun, J. C. C., and Cheng, V.
S. Y. 2007. “End-to-End Privacy Control in Service Outsourcing of Human Intensive Processes: A Multi-Layered Web Service Integration
Approach,” Information Systems Frontiers (9:1), pp. 85-101.
Ishitani, L., Almeida, V., and Wagner, M. 2003. “Masks: Bringing Anonymity and Personalization Together,” IEEE Security & Privacy
Magazine (1:3), pp. 18-23.
Jackson, L. A., von Eye, A., Barbatsis, G., Biocca, F., Zhao, Y., and Fitzgerald, H. E. 2003. “Internet Attitudes and Internet Use: Some
Surprising Findings from the HomeNetToo project,” International Journal of Human-Computer Studies (59:3), pp. 355-382.
Jennings, C., and Fena, L. 2000. The Hundredth Window: Protecting Your Privacy and Security in the Age of the Internet, New York: Free
Press.
Jensen, C., Potts, C., and Jensen, C. 2005. “Privacy Practices of Internet Users: Self-Reports Versus Observed Behavior,” International
Journal of Human-Computer Studies (63), pp. 203-227.
Jiang, X. D., and Landay, A. J. 2002. “Modeling Privacy Control in Context-Aware Systems,” IEEE Pervasive Computing (1:3), pp. 59-63.
Johnson, C. A. 1974. “Privacy as Personal Control,” in Man-Environment Interactions: Evaluations and Applications: Part 2, D. H. Carson
(ed.), Washington, DC: Environmental Design Research Association, pp. 83-100.
Jones, M. 1974. Privacy, Newton Abbot, UK: David & Charles Publishing.
Jones, M. G. 1991. “Privacy: A Significant Marketing Issue for the 1990s,” Journal of Public Policy and Marketing (10:1), pp. 133-148.
Jourard, S. M. 1966. “Some Psychological Aspects of Privacy,” Law and Contemporary Problems (31:2), pp. 307-318.
Juang, W. S., Lei, C. L., and Liaw, H. T. 2002. “Privacy and Anonymity Protection with Blind Threshold Signatures,” International Journal
of Electronic Commerce (7:2), pp. 143-157.
Kalven Jr., H. 1966. “Privacy in Tort Law: Were Warren and Brandeis Wrong?,” Law and Contemporary Problems (31:2), pp. 326-341.
Kalvenes, J., and Basu, A. 2006. “Design of Robust Business-to-Business Electronic Marketplaces with Guaranteed Privacy,” Management
Science (52:11), pp. 1721-1736.
Karat, C. M., Brodie, C., and Karat, J. 2006. “Usable Privacy and Security for Personal Information Management,” Communications of the
ACM (49:1), pp. 56-57.
Karat, J., Karat, C. M., Brodie, C., and Feng, J. J. 2005. “Privacy in Information Technology: Designing to Enable Privacy Policy
Management in Organizations,” International Journal of Human-Computer Studies (63:1/2), pp. 153-174.
Katos, V., and Adams, C. 2005. “Modeling Corporate Wireless Security and Privacy,” Journal of Strategic Information Systems (14:3), pp.
307-321.
Kaufman, J. H., Edlund, S., Ford, D. A., and Powers, C. 2005. “The Social Contract Core,” Electronic Commerce Research (5:1), pp. 141-165.
Kayworth, T., Brocato, L., and Whitten, D. 2005. “What Is a Chief Privacy Officer? An Analysis Based on Mintzberg’s Taxonomy of
Managerial Roles,” Communications of the AIS (16:1), pp. 110-126.
Kelvin, P. 1973. “A Social Psychological Examination of Privacy,” British Journal of Social and Clinical Psychology (12), pp. 248-261.
Kent, S. T., Millett, L. I., National Research Council (U.S.) Committee on Authentication Technologies and Their Privacy Implications, and
National Academies Press (U.S.). 2003. Who Goes There? Authentication through the Lens of Privacy, Washington, DC: National
Academies Press.
Kenyon, A. T., and Richardson, M. 2006. New Dimensions in Privacy Law: International and Comparative Perspectives, Cambridge, UK:
Cambridge University Press.
Klang, M., and Murray, A. 2005. Human Rights in the Digital Age, London: Glasshouse Press.
Kling, R. 1995. “Information Technologies and the Shifting Balance between Privacy and Social Control,” in Computerization and Controversy:
Value Conflicts and Social Choices (2nd ed.), R. Kling (ed.), San Diego, CA: Academic Press, pp. 614-636.
Kling, R. 1996. Computerization and Controversy: Value Conflicts and Social Choices (2nd ed.), San Diego: Academic Press.
Kling, R. 1997. “Special Issue: Free Speech and Privacy in Canada’s Information Society,” Information Society (13:2), pp. R3-R4.
Kling, R., and Allen, J. P. 1996. “How the Marriage of Management and Computing Intensifies the Struggle for Personal Privacy,” in New
Technology, Surveillance and Social Control, D. Lyon and E. Zureik (eds.), Minneapolis, MN: University of Minnesota Press, pp. 104-131.
Kling, R., Lee, Y. C., Teich, A., and Frankel, M. S. 1999. “Assessing Anonymous Communication on the Internet: Policy Deliberations,”
Information Society (15:2), pp. 79-90.
Klopfer, P. H., and Rubenstein, D. I. 1977. “The Concept Privacy and its Biological Basis,” Journal of Social Issues (33:3), pp. 52-65.
Klosek, J. 2000. Data Privacy in the Information Age, Westport, CT: Quorum Books.
Klosek, J. 2007. The War on Privacy, Westport, CT: Praeger Publishers.
Kluger, A. N., and Tikochinsky, J. 2001. “The Error of Accepting the ‘Theoretical’ Null Hypothesis: The Rise, Fall, and Resurrection of
Commonsense Hypotheses in Psychology,” Psychological Bulletin (127:3), pp. 408-423.
Kobsa, A. 2001. “Tailoring Privacy to Users’ Needs (Invited Keynote),” in User Modeling 2001: 8th International Conference, M. Bauer,
P. J. Gmytrasiewicz, and J. Vassileva (eds.), Berlin: Springer Verlag, pp. 303-313.
Kobsa, A. 2002. “Personalized Hypermedia and International Privacy,” Communications of the ACM (45:5), pp. 64-67.
Kobsa, A. 2007. “Privacy-Enhanced Personalization,” Communications of the ACM (50:8), pp. 24-33.
Kobsa, A., and Cranor, L. (eds.). 2005. Proceedings of UM’05: 10th International Conference on User Modeling, Workshop 9: Privacy
Enhanced Personalization, Edinburgh, UK, July 24-29 (http://www.isr.uci.edu/pep05/papers/w9-proceedings.pdf).
Kobsa, A., Koenemann, J., and Pohl, W. 2001. “Personalized Hypermedia Presentation Techniques for Improving Customer Relationships,”
The Knowledge Engineering Review (16), pp. 111-155.
Kobsa, A., and Schreck, J. 2003. “Privacy through Pseudonymity in User-Adaptive Systems,” ACM Transactions on Internet Technology (3),
pp. 149-183.
Koch, M., and Moslein, K. M. 2005. “Identities Management for E-Commerce and Collaboration Applications,” International Journal of
Electronic Commerce (9:3), pp. 11-29.
Konvitz, M. R. 1966. “Privacy and the Law: A Philosophical Prelude,” Law and Contemporary Problems (31:2), pp. 272-280.
LaMay, C. L. 2003. Journalism and the Debate Over Privacy, Mahwah, NJ: Lawrence Erlbaum Associates.
Lane, C. A. 2002. Naked in Cyberspace: How to Find Personal Information Online (2nd ed.), Medford, NJ: CyberAge Books/Information
Today.
Lane, F. S. 2003. The Naked Employee: How Technology Is Compromising Workplace Privacy, New York: AMACOM.
Langenderfer, J., and Linnhoff, S. 2005. “The Emergence of Biometrics and its Effect on Consumers,” Journal of Consumer Affairs (39:2),
pp. 314-338.
Lankshear, G., and Mason, D. 2001. “Technology and Ethical Dilemmas in a Medical Setting: Privacy, Professional Autonomy, Life and
Death,” Ethics and Information Technology (3:3), pp. 223-233.
LaRose, R., and Rifon, N. 2006. “Your Privacy Is Assured—of Being Disturbed: Comparing Web Sites with and Without Privacy Seals,”
New Media and Society (8:6), pp. 1009-1029.
LaRose, R., and Rifon, N. J. 2007. “Promoting i-Safety: Effects of Privacy Warnings and Privacy Seals on Risk Assessment and Online
Privacy Behavior,” Journal of Consumer Affairs (41:1), pp. 127-149.
Lau, T., Etzioni, O., and Weld, D. S. 1999. “Privacy Interfaces for Information Management,” Communications of the ACM (42:10), pp. 88-94.
Laudon, K. C. 1996. “Markets and Privacy,” Communications of the ACM (39:9), pp. 92-104.
Laufer, R. S., and Wolfe, M. 1977. “Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory,” Journal of Social
Issues (33), pp. 22-41.
Lederer, S., Hong, J. I., Dey, A. K., and Landay, J. A. 2004. “Personal Privacy through Understanding and Action: Five Pitfalls for
Designers,” Personal and Ubiquitous Computing (8:6), pp. 440-454.
Lederer, S., Mankoff, J., and Dey, A. K. 2003. “Who Wants to Know What When? Privacy Preference Determinants in Ubiquitous
Computing,” in Proceedings of Conferences on Human Factors in Computing Systems, Fort Lauderdale, FL.
Lee, C.-P. 2005. “Public Sector Information in the Digital Age,” Information Technology & People (18:4), pp. 405-407.
Lee, L. T. 2002. “Defining Privacy: Freedom in a Democratic Constitutional State,” Journal of Broadcasting & Electronic Media (46:4),
pp. 646-651.
Levin, H. A., and Askin, F. 1977. “Privacy in Courts – Law and Social Reality,” Journal of Social Issues (33:3), pp. 138-153.
Li, X. B., and Sarkar, S. 2006. “Privacy Protection in Data Mining: A Perturbation Approach for Categorical Data,” Information Systems
Research (17:3), pp. 254-270.
Lin, Z., Owen, A. B., and Altman, R. B. 2004. “Genomic Research and Human Subject Privacy,” Science (305:5681), p. 183.
Linowes, D. F. 1989. Privacy in America: Is Your Private Life in the Public Eye?, Urbana, IL: University of Illinois Press.
Little, L., Briggs, P., and Coventry, L. 2005. “Public Space Systems: Designing for Privacy?,” International Journal of Human-Computer
Studies (63:1/2), pp. 254-268.
Liu, C., Marchewka, J. T., Lu, J., and Yu, C. S. 2005. “Beyond Concern – A Privacy-Trust-Behavioral Intention Model of Electronic
Commerce,” Information & Management (42:2), pp. 289-304.
Livraghi, G., and Monti, A. 2002. “The Network Society as Seen from Italy,” Information Society (18:3), pp. 165-179.
Lu, Y., Tan, B. C. Y., and Hui, K.-L. 2004. “Inducing Customers to Disclose Personal Information to Internet Businesses with Social
Adjustment Benefits,” in Proceedings of 25th International Conference on Information Systems, R. Agarwal, L. J. Kirsch, and J. I. DeGross
(eds.), Washington, DC, pp. 272-281.
Luo, X. 2002. “Trust Production and Privacy Concerns on the Internet: A Framework Based on Relationship Marketing and Social Exchange
Theory,” Industrial Marketing Management (31:2), pp. 111-118.
Lwin, M. O., and Williams, J. D. 2003. “A Model Integrating the Multidimensional Developmental Theory of Privacy and Theory of Planned
Behavior to Examine Fabrication of Information Online,” Marketing Letters (14:4), pp. 257-272.
Lyon, D. 1994. The Electronic Eye: The Rise of the Surveillance Society, Minneapolis, MN: University of Minnesota Press.
Lyon, D. 2001a. “Facing the Future: Seeking Ethics for Everyday Surveillance,” Ethics and Information Technology (3:3), pp. 171-180.
Lyon, D. 2001b. Surveillance Society: Monitoring Everyday Life, Philadelphia, PA: Open University Press.
Lyon, D. 2002. “Surveillance in Cyberspace: The Internet, Personal Data, and Social Control,” Queen’s Quarterly (109:3), pp. 1-12.
Lyon, D., and Zureik, E. 1996. Computers, Surveillance, and Privacy, Minneapolis, MN: University of Minnesota Press.
Madgwick, D., and Smythe, T. 1974. The Invasion of Privacy, London: Pitman.
Mael, F. A., Connerley, M., and Morath, R. A. 1996. “None of Your Business: Parameters of Biodata Invasiveness,” Personnel Psychology
(49:3), pp. 613-650.
Malhotra, N. K., Kim, S. S., and Agarwal, J. 2004. “Internet Users’ Information Privacy Concerns (IUIPC): The Construct, the Scale, and
a Causal Model,” Information Systems Research (15:4), pp. 336-355.
Margulis, S. T. 1974. “Privacy as Behavioral Phenomenon: Coming of Age (1),” in Man-Environment Interactions: Evaluations and
Applications: Part 2, D. H. Carson (ed.), Washington, DC: Environmental Design Research Association, pp. 101-123.
Margulis, S. T. 1977a. “Conceptions of Privacy – Current Status and Next Steps,” Journal of Social Issues (33:3), pp. 5-21.
Margulis, S. T. 1977b. “Privacy as a Behavioral Phenomenon – Introduction,” Journal of Social Issues (33:3), pp. 1-4.
Margulis, S. T. 2003a. “On the Status and Contribution of Westin’s and Altman’s Theories of Privacy,” Journal of Social Issues (59:2), pp.
411-429.
Margulis, S. T. 2003b. “Privacy as a Social Issue and Behavioral Concept,” Journal of Social Issues (59:2), pp. 243-261.
Marshall, N. J. 1974. “Dimensions of Privacy Preferences,” Multivariate Behavioral Research (9:3), pp. 255-271.
Martin, J. 1973. Security, Accuracy, and Privacy in Computer Systems, Englewood Cliffs, NJ: Prentice-Hall.
Marx, G. T. 1999. “What’s in a Name? Some Reflections on the Sociology of Anonymity,” Information Society (15:2), pp. 99-112.
Marx, G. T. 2001. “Murky Conceptual Waters: The Public and the Private,” Ethics and Information Technology (3:3), pp. 157-169.
Marx, G. T. 2003. “A Tack in the Shoe: Neutralizing and Resisting the New Surveillance,” Journal of Social Issues (59:2), pp. 369-390.
Mason, R. O. 1986. “Four Ethical Issues of the Information Age,” MIS Quarterly (10:1), pp. 4-12.
McCrohan, K. F. 1989. “Information Technology, Privacy, and the Public Good,” Journal of Public Policy and Marketing (8:1), pp. 265-278.
McGinity, M. 2000. “Surfing Your Turf: For a Medium that Holds Anonymity in High Regard, Privacy Is Fast Eroding,” Communications
of the ACM (43:4), pp. 19-21.
McKnight, D. H., and Chervany, N. L. 2001. “What Trust Means in E-Commerce Customer Relationships: An Interdisciplinary Conceptual
Typology,” International Journal of Electronic Commerce (6:2), pp. 35-59.
McLean, D. 1995. Privacy and Its Invasion, Westport, CT: Praeger.
McRobb, S., and Rogerson, S. 2004. “Are They Really Listening? An Investigation into Published Online Privacy Policies at the Beginning
of the Third Millennium,” Information Technology & People (17:4), pp. 442-461.
McWhirter, D. A. 1994. Search, Seizure, and Privacy, Phoenix, AZ: Oryx.
Meeks, B. N. 1997. “Privacy Lost, Anytime, Anywhere,” Communications of the ACM (40:8), pp. 11-13.
Meeks, B. N. 1999. “The Privacy Hoax,” Communications of the ACM (42:2), pp. 17-19.
Meeks, B. N. 2003. “Conspicuous in Their Silence: Where Are the Voices Defending the Very Fought-After Privacy Rights Now Threatened
in the Name of Homeland Security?,” Communications of the ACM (46:2), pp. 15-16.
Melanson, P. H. 2001. Secrecy Wars: National Security, Privacy, and the Public’s Right to Know, Washington, DC: Brassey’s.
Metzger, M. J. 2004. “Privacy, Trust, and Disclosure: Exploring Barriers to Electronic Commerce,” Journal of Computer-Mediated
Communication (9:4).
Metzger, M. J. 2007. “Communication Privacy Management in Electronic Commerce,” Journal of Computer-Mediated Communication (12:2),
pp. 335-361.
Metzger, M. J., and Docter, S. 2003. “Public Opinion and Policy Initiatives for Online Privacy Protection,” Journal of Broadcasting &
Electronic Media (47:3), pp. 350-375.
Michael, J. 1994. Privacy and Human Rights: An International and Comparative Study, with Special Reference to Developments in
Information Technology, Brookfield, VT: Dartmouth Publishing Company.
Milberg, S. J., Burke, S. J., Smith, H. J., and Kallman, E. A. 1995. “Values, Personal Information Privacy, and Regulatory Approaches,”
Communications of the ACM (38:12), pp. 65-74.
Milberg, S. J., Smith, H. J., and Burke, S. J. 2000. “Information Privacy: Corporate Management and National Regulation,” Organization
Science (11:1), pp. 35-57.
Miller, M. 2002. Absolute PC Security and Privacy, San Francisco: SYBEX.
Milne, G. R. 1997. “Consumer Participation in Mailing Lists: A Field Experiment,” Journal of Public Policy and Marketing (16:2), pp.
298-309.
Milne, G. R. 2000. “Privacy and Ethical Issues in Database/Interactive Marketing and Public Policy: A Research Framework and Overview
of the Special Issue,” Journal of Public Policy & Marketing (19:1), pp. 1-6.
Milne, G. R., and Boza, M.-E. 1999. “Trust and Concern in Consumers’ Perceptions of Marketing Information Management Practices,”
Journal of Interactive Marketing (13:1), pp. 5-24.
Milne, G. R., and Culnan, M. J. 2002. “Using the Content of Online Privacy Notices to Inform Public Policy: A Longitudinal Analysis of
the 1998-2001 US Web Surveys,” Information Society (18:5), pp. 345-359.
Milne, G. R., and Culnan, M. J. 2004. “Strategies for Reducing Online Privacy Risks: Why Consumers Read (or Don’t Read) Online Privacy
Notices,” Journal of Interactive Marketing (18:3), pp. 15-29.
Milne, G. R., and Gordon, E. M. 1993. “Direct Mail Privacy-Efficiency Trade-Offs Within an Implied Social Contract Framework,” Journal
of Public Policy and Marketing (12:2), pp. 206-215.
Milne, G. R., and Rohm, A. 2000. “Consumer Privacy and Name Removal Across Direct Marketing Channels: Exploring Opt-in and Opt-out
Alternatives,” Journal of Public Policy and Marketing (19:2), pp. 238-249.
Milne, G. R., Rohm, A. J., and Bahl, S. 2004. “Consumers’ Protection of Online Privacy and Identity,” Journal of Consumer Affairs (38:2),
pp. 217-232.
Miyazaki, A., and Krishnamurthy, S. 2002. “Internet Seals of Approval: Effects on Online Privacy Policies and Consumer Perceptions,”
Journal of Consumer Affairs (36:1), pp. 28-49.
Miyazaki, A. D., and Fernandez, A. 2000. “Internet Privacy and Security: An Examination of Online Retailer Disclosures,” Journal of Public
Policy & Marketing (19:1), pp. 54-61.
Miyazaki, A. D., and Fernandez, A. 2001. “Consumer Perceptions of Privacy and Security Risks for Online Shopping,” Journal of Consumer
Affairs (35:1), pp. 27-44.
Mizell, L. R. 1998. Invasion of Privacy, New York: Berkley Books.
Monmonier, M. S. 2002. Spying with Maps: Surveillance Technologies and the Future of Privacy, Chicago: University of Chicago Press.
Moore, A. D. 2005. Information Ethics: Privacy, Property, and Power, Seattle, WA: University of Washington Press.
Moore, B. 1984. Privacy: Studies in Social and Cultural History, New York: M. E. Sharpe.
Moores, T. 2005. “Do Consumers Understand the Role of Privacy Seals in E-Commerce?,” Communications of the ACM (48:3), pp. 86-91.
Moores, T. T., and Dhillon, G. 2003. “Do Privacy Seals in E-Commerce Really Work?,” Communications of the ACM (46:12), pp. 265-271.
Nakada, M., and Tamura, T. 2005. “Japanese Conceptions of Privacy: An Intercultural Perspective,” Ethics and Information Technology (7:1),
pp. 27-36.
Negley, G. 1966. “Philosophical Views on the Value of Privacy,” Law and Contemporary Problems (31:2), pp. 319-325.
Neumann, P. G. 2005. “Don’t Trade Privacy for Security – Reply,” Communications of the ACM (48:2), p. 13.
Newell, P. B. 1998. “A Cross-Cultural Comparison of Privacy Definitions and Functions: A Systems Approach,” Journal of Environmental
Psychology (18:4), pp. 357-371.
Ng-Kruelle, G., Swatman, P. A., Rebne, D. S., and Hampe, J. F. 2003. “The Price of Convenience: Developing a Framework for Analysing
Privacy Sensitivity in the Adoption of Wireless Applications,” in Proceedings of 16th Bled Electronic Commerce Conference, Bled, Slovenia,
pp. 480-500.
Nicholson, S., and Smith, C. A. 2007. “Using Lessons from Health Care to Protect the Privacy of Library Users: Guidelines for the
De-identification of Library Data Based on HIPAA,” Journal of the American Society for Information Science and Technology (58:8), pp.
1198-1206.
Nissenbaum, H. 1998. “Protecting Privacy in an Information Age: The Problem of Privacy in Public,” Law and Philosophy (17:5), pp.
559-596.
Nissenbaum, H. 2004. “Privacy as Contextual Integrity,” Washington Law Review (79:1), pp. 119-158.
Nock, S. L. 1993. The Costs of Privacy: Surveillance and Reputation in America, New York: Aldine.
Norberg, P. A., and Horne, D. R. 2007. “Privacy Attitudes and Privacy-Related Behavior,” Psychology and Marketing (24:10), pp. 829-847.
Norberg, P. A., Horne, D. R., and Horne, D. A. 2007. “The Privacy Paradox: Personal Information Disclosure Intentions Versus Behaviors,”
Journal of Consumer Affairs (41:1), pp. 100-126.
Nord, G. D., McCubbins, T. F., and Nord, J. H. 2006. “E-Monitoring in the Workplace: Privacy, Legislation, and Surveillance Software,”
Communications of the ACM (49:8), pp. 72-77.
Nowak, G. J., and Phelps, J. 1992. “Understanding Privacy Concerns: An Assessment of Consumers’ Information-Related Knowledge and
Beliefs,” Journal of Direct Marketing (6:4), pp. 28-39.
Nowak, G. J., and Phelps, J. 1997. “Direct Marketing and the Use of Individual-Level Consumer Information: Determining How and When
‘Privacy’ Matters,” Journal of Direct Marketing (11:4), pp. 94-108.
O’Brien, D. M. 1979. Privacy, Law, and Public Policy, New York: Praeger.
Odlyzko, A. 2004. “Privacy, Economics, and Price Discrimination on the Internet,” in Economics of Information Security, L. J. Camp and
S. Lewis (eds.), New York: Springer, pp. 187-211.
Packard, V. O. 1964. The Naked Society, New York: David McKay.
Pan, Y., and Zinkhan, G. M. 2006. “Exploring the Impact of Online Privacy Disclosures on Consumer Trust,” Journal of Retailing (82:4),
pp. 331-338.
Patil, S., and Kobsa, A. 2005. “Uncovering Privacy Attitudes and Practices in Instant Messaging,” in Proceedings of the 2005 International
ACM SIGGROUP Conference on Supporting Group Work, New York: ACM, pp. 109-112.
Paul, D. L., Pearlson, K. E., and McDaniel, R. R. 1999. “Assessing Technological Barriers to Telemedicine: Technology-Management
Implications,” IEEE Transactions on Engineering Management (46:3), pp. 279-288.
Payton, F. C. 2003. “E-Health Models Leading to Business-to-Employee Commerce in the Human Resources Function,” Journal of
Organizational Computing and Electronic Commerce (13:2), pp. 147-161.
Pedersen, D. M. 1997. “Psychological Functions of Privacy,” Journal of Environmental Psychology (17:2), pp. 147-156.
Pedersen, D. M. 1999. “Model for Types of Privacy by Privacy Functions,” Journal of Environmental Psychology (19:4), pp. 397-405.
Pember, D. R. 1972. Privacy and the Press: The Law, the Mass Media, and the First Amendment, Seattle, WA: University of Washington
Press.
Penders, J. 2004. “Privacy in (Mobile) Telecommunications Services,” Ethics and Information Technology (6:4), pp. 247-260.
Pennock, J. R., and Chapman, J. W. 1971. Privacy, New York: Atherton Press.
Peslak, A. R. 2005. “Internet Privacy Policies: A Review and Survey of the Fortune 50,” Information Resources Management Journal (18:1),
pp. 29-41.
Peters, T. A. 1998. “Review of Privacy on the Line: The Politics of Wiretapping and Encryption, by Whitfield Diffie and Susan Landau,”
Journal of the American Society for Information Science (49:14), pp. 1339-1340.
Peters, T. A. 1999. Computerized Monitoring and Online Privacy, Jefferson, NC: McFarland & Co.
Petrison, L. A., and Wang, P. 1995. “Exploring the Dimensions of Consumer Privacy: An Analysis of Coverage in British and American
Media,” Journal of Direct Marketing (9:4), pp. 19-37.
Petronio, S. S. 2000. Balancing the Secrets of Private Disclosures, Mahwah, NJ: Lawrence Erlbaum Associates.
Petronio, S. S. 2002. Boundaries of Privacy: Dialectics of Disclosure, Albany, NY: State University of New York Press.
Petty, R. D. 2000. “Marketing Without Consent: Consumer Choice and Costs, Privacy, and Public Policy,” Journal of Public Policy and
Marketing (19:1), pp. 42-57.
Phelps, J. E., D’Souza, G., and Nowak, G. J. 2001. “Antecedents and Consequences of Consumer Privacy Concerns: An Empirical
Investigation,” Journal of Interactive Marketing (15:4), pp. 2-17.
Phelps, J. E., Gonzenbach, W., and Johnson, E. 1994. “Press Coverage and Public Perception of Direct Marketing and Consumer Privacy,”
Journal of Direct Marketing (8:2), pp. 9-22.
Phelps, J. E., Nowak, G., and Ferrell, E. 2000. “Privacy Concerns and Consumer Willingness to Provide Personal Information,” Journal of
Public Policy and Marketing (19:1), pp. 27-41.
MIS Quarterly Vol. 35 No. 4—Appendices/December 2011
A23
Smith et al./Information Privacy Research
Podesta, J., Shane, P. M., and Leone, R. C. 2004. A Little Knowledge: Privacy, Security, and Public Information After September 11, New
York: Century Foundation Press.
Poindexter, J. C., Earp, J. B., and Baumer, D. L. 2006. “An Experimental Economics Approach Toward Quantifying Online Privacy Choices,”
Information Systems Frontiers (8:5), pp. 363-374.
Polat, H., and Du, W. L. 2005. “Privacy-Preserving Collaborative Filtering,” International Journal of Electronic Commerce (9:4), pp. 9-35.
Posner, R. A. 1981. “The Economics of Privacy,” American Economic Review (71:2), pp. 405-409.
Posner, R. A. 1984. “An Economic Theory of Privacy,” in Philosophical Dimensions of Privacy: An Anthology, F. Schoeman (ed.),
Cambridge, UK: Cambridge University Press, pp. 333-345.
Post, R. C. 1989. “The Social Foundations of Privacy: Community and Self in the Common Law Tort,” California Law Review (77), pp.
957-959.
Post, R. C. 2000. “Three Concepts of Privacy,” Georgetown Law Journal (89), pp. 2087-2098.
Pottie, G. J. 2004. “Privacy in the Global E-Village,” Communications of the ACM (47:2), pp. 21-23.
Prendergast, C. 2002. “The Tenuous Trade-Off between Risk and Incentives,” Journal of Political Economy (110:5), pp. 1071-1102.
Price, B. A., Adam, K., and Nuseibeh, B. 2005. “Keeping Ubiquitous Computing to Yourself: A Practical Model for User Control of Privacy,”
International Journal of Human-Computer Studies (63:1-2), pp. 228-253.
Purcell, R., and Fusaro, R. 2000. “Chief Privacy Officer,” Harvard Business Review (78:6).
Raab, C. D., and Bennett, C. J. 1998. “The Distribution of Privacy Risks: Who Needs Protection?,” Information Society (14:4), pp. 263-274.
Raines, J. C. 1974. Attack on Privacy, Valley Forge, PA: Judson Press.
Ranganathan, C., and Ganapathy, S. 2002. “Key Dimensions of Business-to-Consumer Web Sites,” Information & Management (39:6), pp.
457-465.
Reagle, J., and Cranor, L. F. 1999. “The Platform for Privacy Preferences,” Communications of the ACM (42:2), pp. 48-55.
Regan, P. M. 1995. Legislating Privacy: Technology, Social Values, and Public Policy, Chapel Hill, NC: University of North Carolina Press.
Regan, P. M. 2003. “Safe Harbors or Free Frontiers? Privacy and Transborder Data Flows,” Journal of Social Issues (59:2), pp. 263-282.
Reiman, J. 1984. “Privacy, Intimacy, and Personhood,” in Philosophical Dimensions of Privacy: An Anthology, F. Schoeman (ed.),
Cambridge, UK: Cambridge University Press, pp. 300-316.
Rensel, A. D., Abbas, J. M., and Rao, H. R. 2006. “Private Transactions in Public Places: An Exploration of the Impact of the Computer
Environment on Public Transactional Web Site Use,” Journal of the Association for Information Systems (7:1), pp. 19-50.
Ribak, R., and Turow, J. 2003. “Internet Power and Social Context: A Globalization Approach to Web Privacy Concerns,” Journal of
Broadcasting & Electronic Media (47:3), pp. 328-350.
Rifon, N. J., LaRose, R., and Choi, S. M. 2005. “Your Privacy Is Sealed: Effects of Web Privacy Seals on Trust and Personal Disclosures,”
Journal of Consumer Affairs (39:2), pp. 339-362.
Rindfleisch, T. C. 1997. “Privacy, Information Technology, and Health Care,” Communications of the ACM (40:8), pp. 92-100.
Roberts, J. M., and Gregor, T. 1971. “Privacy: A Cultural View,” in Privacy, J. R. Pennock and J. W. Chapman (eds.), New York: Atherton Press,
pp. 199-225.
Roboff, G., and Charles, C. 1998. “Privacy of Financial Information in Cyberspace: Banks Addressing What Consumers Want,” Journal of
Retail Banking Services (20:3), pp. 51-56.
Rosen, J. 2000. The Unwanted Gaze: The Destruction of Privacy in America, New York: Random House.
Rosen, J. 2003. “A Cautionary Tale for a New Age of Surveillance,” in Terrorism in Perspective, P. Griset and S. Mahan (eds.), Newbury
Park, CA: Sage Publications.
Rosenfield, J. R. 1995. “From the Practitioners: It’s Time to Sound the Privacy Alarm!,” Journal of Direct Marketing (9:3), pp. 1-4.
Rubin, M. R. 1988. Private Rights, Public Wrongs: The Computer and Personal Privacy, Norwood, NJ: Ablex Publishing.
Rust, R. T., Kannan, P. K., and Peng, N. 2002. “The Customer Economics of Internet Privacy,” Journal of the Academy of Marketing Science
(30:4), pp. 455-464.
Sagi, J., Carayannis, E., Dasgupta, S., and Thomas, G. 2004. “ICT and Business in the New Economy: Globalization and Attitudes Towards
eCommerce,” Journal of Global Information Management (12:3), pp. 44-64.
Salomon, D. 2003. Data Privacy and Security, New York: Springer.
Schachter, M. 2003. Informational and Decisional Privacy, Durham, NC: Carolina Academic Press.
Schein, V. E. 1977. “Individual Privacy and Personnel Psychology: Need for a Broader Perspective,” Journal of Social Issues (33:3), pp.
154-168.
Schneier, B., and Banisar, D. 1997. The Electronic Privacy Papers: Documents on the Battle for Privacy in the Age of Surveillance, New
York: Wiley.
Schoeman, F. D. 1984. Philosophical Dimensions of Privacy: An Anthology, Cambridge, UK: Cambridge University Press.
Schultz, D. E. 1988. “On Merging, Purging and Privacy,” Journal of Direct Marketing (2:1), pp. 4-5.
Schwaig, K. S., Kane, G. C., and Storey, V. C. 2005. “Privacy, Fair Information Practices and the Fortune 500: The Virtual Reality of
Compliance,” ACM SIGMIS Database (36:1), pp. 49-63.
Schwaig, K. S., Kane, G. C., and Storey, V. C. 2006. “Compliance to the Fair Information Practices: How are the Fortune 500 Handling
Online Privacy Disclosures?,” Information & Management (43:7), pp. 805-820.
Schwartz, P. M. 1997. “Privacy and the Economics of Personal Health Care Information,” Texas Law Review (76:1), pp. 1-75.
Sewell, G., and Barker, J. R. 2001. “Neither Good, Nor Bad, But Dangerous: Surveillance as an Ethical Paradox,” Ethics and Information
Technology (3:3), pp. 181-194.
Shah, J. R., White, G. L., and Cook, J. R. 2007. “Privacy Protection Overseas as Perceived by USA-Based IT Professionals,” Journal of
Global Information Management (15:1), pp. 68-81.
Shamis, G. S., Green, M. C., Sorensen, S. M., and Kyle, D. L. 2005. “Outsourcing, Offshoring, Nearshoring: What to Do? Whether You
Outsource Locally or Globally, You’re Still Responsible for Client Privacy,” Journal of Accountancy (199:6), pp. 57-62.
Shapira, B., Elovici, Y., Meshiach, A., and Kuflik, T. 2005. “PRAW – The Model for PRivAte Web,” Journal of the American Society for
Information Science and Technology (56:2), pp. 159-172.
Shapiro, S. 1998. “Places and Spaces: The Historical Interaction of Technology, Home, and Privacy,” Information Society (14:4), pp. 275-284.
Sheehan, K. B. 1999. “An Investigation of Gender Differences in On-Line Privacy Concerns and Resultant Behaviors,” Journal of Interactive
Marketing (13:4), pp. 24-38.
Sheehan, K. B. 2002. “Toward a Typology of Internet Users and Online Privacy Concerns,” Information Society (18:1), pp. 21-32.
Sheehan, K. B., and Hoy, M. G. 1999. “Flaming, Complaining, Abstaining: How Online Users Respond to Privacy Concerns,” Journal of
Advertising (28:3), pp. 37-51.
Sheehan, K. B., and Hoy, M. G. 2000. “Dimensions of Privacy Concern Among Online Consumers,” Journal of Public Policy & Marketing
(19:1), pp. 62-73.
Shils, E. 1966. “Privacy: Its Constitution and Vicissitudes,” Law and Contemporary Problems (31:2), pp. 281-306.
Siau, K., and Kam, H. J. 2006. “e-Healthcare in ABC County Health Department (ABCCHD): Trade-Offs Analysis and Evaluation,”
Journal of Information Technology (21:1), pp. 66-71.
Simmel, A. 1971. “Privacy Is Not an Isolated Freedom,” in Privacy, J. R. Pennock and J. W. Chapman (eds.), New York: Atherton Press,
pp. 71-87.
Sipior, J. C., and Ward, B. T. 1995. “The Ethical and Legal Quandary of Email Privacy,” Communications of the ACM (38:12), pp. 48-54.
Sipior, J. C., Ward, B. T., and Rainone, S. M. 1998. “Ethical Management of Employee E-Mail Privacy,” Information Systems Management
(15:1), pp. 41-47.
Skousen, M. 1982. Mark Skousen’s Complete Guide to Financial Privacy (3rd rev. ed.), Alexandria, VA: Alexandria House Books.
Slough, M. C. 1969. Privacy, Freedom, and Responsibility, Springfield, IL: Thomas.
Smith, H. J. 1994. Managing Privacy: Information Technology and Corporate America, Chapel Hill, NC: University of North Carolina Press.
Smith, H. J. 2001. “Information Privacy and Marketing: What the US Should (and Shouldn’t) Learn from Europe,” California Management
Review (43:2), pp. 8-33.
Smith, H. J. 2004. “Information Privacy and its Management,” MIS Quarterly Executive (3:4), pp. 201-213.
Smith, H. J., Milberg, S. J., and Burke, S. J. 1996. “Information Privacy: Measuring Individuals’ Concerns About Organizational Practices,”
MIS Quarterly (20:2), pp. 167-196.
Smith, H. J. 1993. “Privacy Policies and Practices: Inside the Organization Maze,” Communications of the ACM (36:12), pp. 104-122.
Sobel, L. A. 1976. War on Privacy, New York: Facts on File.
Solove, D. J. 2004. The Digital Person: Technology and Privacy in the Information Age, New York: New York University Press.
Solove, D. J. 2006. “A Taxonomy of Privacy,” University of Pennsylvania Law Review (154:3), pp. 477-560.
Solove, D. J. 2007. “‘I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy,” San Diego Law Review (44).
Solove, D. J. 2008. Understanding Privacy, Cambridge, MA: Harvard University Press.
Son, J. Y., and Kim, S. S. 2008. “Internet Users’ Information Privacy-Protective Responses: A Taxonomy and a Nomological Model,” MIS
Quarterly (32:3), pp. 503-529.
Sopinka, J. 1997. “Freedom of Speech and Privacy in the Information Age,” Information Society (13:2), pp. 171-184.
Spangler, W. E., Hartzel, K. H., and Gal-or, M. 2006. “Exploring the Privacy Implications of Addressable Advertising and Viewer Profiling,”
Communications of the ACM (49:5), pp. 119-123.
Spencer, T. 2006. “Privacy and Transparency: The Race to the Bottom,” Journal of Public Affairs (6:2), pp. 168-170.
Spiekermann, S., Grossklags, J., and Berendt, B. 2001. “E-Privacy in 2nd Generation E-Commerce: Privacy Preferences Versus Actual
Behavior,” in Proceedings of the 3rd ACM Conference on Electronic Commerce, New York: ACM, pp. 38-47.
Stalder, F. 2002a. “Failures and Successes: Notes on the Development of Electronic Cash,” Information Society (18:3), pp. 209-219.
Stalder, F. 2002b. “Opinion: Privacy Is Not the Antidote to Surveillance,” Surveillance & Society (1:1), pp. 120-124.
Stallman, R. 2005. “To Defend Privacy, Don’t Collect Personal Data,” Communications of the ACM (48:5), p. 11.
Stanton, J. M., and Stam, K. R. 2003. “Information Technology, Privacy, and Power Within Organizations: A View from Boundary Theory
and Social Exchange Perspectives,” Surveillance & Society (1:2), pp. 152-190.
Stewart, K. A., and Segars, A. H. 2002. “An Empirical Examination of the Concern for Information Privacy Instrument,” Information Systems
Research (13:1), pp. 36-49.
Stigler, G. J. 1980. “An Introduction to Privacy in Economics and Politics,” The Journal of Legal Studies (9:4), pp. 623-644.
Stone, E. F., Gueutal, G. H., Gardner, D. G., and McClure, S. 1983. “A Field Experiment Comparing Information-Privacy Values, Beliefs,
and Attitudes Across Several Types of Organizations,” Journal of Applied Psychology (68:3), pp. 459-468.
Stone, E. F., and Stone, D. L. 1990. “Privacy in Organizations: Theoretical Issues, Research Findings, and Protection Mechanisms,” Research
in Personnel and Human Resources Management (8:3), pp. 349-411.
Stone-Romero, E. F., Stone, D. L., and Hyatt, D. 2003. “Personnel Selection Procedures and Invasion of Privacy,” Journal of Social Issues
(59:2), pp. 343-368.
Straub, D. W., and Collins, R. W. 1990. “Key Information Liability Issues Facing Managers: Software Piracy, Proprietary Databases, and
Individual Rights to Privacy,” MIS Quarterly (14:2), pp. 143-156.
Strickland, L. S., and Hunt, L. E. 2005. “Technology, Security, and Individual Privacy: New Tools, New Threats, and New Public
Perceptions,” Journal of the American Society for Information Science and Technology (56:3), pp. 221-234.
Suh, B., and Han, I. 2003. “The Impact of Customer Trust and Perception of Security Control on the Acceptance of Electronic Commerce,”
International Journal of Electronic Commerce (7:3), pp. 135-161.
Sundstrom, E., Burt, R. E., and Kamp, D. 1980. “Privacy at Work: Architectural Correlates of Job Satisfaction and Job Performance,”
Academy of Management Journal (23:1), pp. 101-117.
Swaminatha, T. M., and Elden, C. R. 2003. Wireless Security and Privacy: Best Practices and Design Techniques, Boston: Addison-Wesley.
Swire, P. P. 1997. “Markets, Self-Regulation, and Government Enforcement in the Protection of Personal Information,” in Privacy and
Self-Regulation in the Information Age, W. M. Daley and L. Irving (eds.), Washington, DC: Department of Commerce, pp. 3-19.
Swire, P. P. 1998. “Of Elephants, Mice, and Privacy: International Choice of Law and the Internet,” International Lawyer (32), pp. 991-1025.
Swire, P. P. 1999. “Financial Privacy and the Theory of High-Tech Government Surveillance,” Washington University Law Quarterly (77:2),
pp. 461-513.
Swire, P. P. 2001. “If Surveillance Expands, Safeguard Civil Liberties,” Atlanta Journal Constitution, October 21.
Swire, P. P. 2002. “Trustwrap, or Why E-Commerce Is Like a Bottle of Tylenol,” Johns Hopkins School of Foreign and International Affairs
Conference on New Technologies and International Governance, Washington, DC, February.
Swire, P. P. 2003. “Efficient Confidentiality for Privacy, Security, and Confidential Business Information,” Brookings-Wharton Papers on
Financial Services, pp. 273-310.
Swire, P. P., and Steinfeld, L. B. 2002. “Security and Privacy after September 11: The Health Care Example,” Minnesota Law Review
(86:1515), pp. 102-122.
Sykes, C. J. 1999. The End of Privacy, New York: St. Martin’s Press.
Tam, E. C., Hui, K. L., and Tan, B. C. Y. 2002. “What Do They Want? Motivating Consumers to Disclose Personal Information to Internet
Businesses,” in Proceedings of the 23rd International Conference on Information Systems, L. Applegate, R. D. Galliers, and J. I. DeGross (eds.),
Barcelona, Spain, pp. 11-21.
Tang, Z., Hu, Y. J., and Smith, M. D. 2005. “Protecting Online Privacy: Self-Regulation, Mandatory Standards, or Caveat Emptor,” in
Proceedings of the Workshop on Economics and Information Security, Cambridge, MA.
Tang, Z., Hu, Y. J., and Smith, M. D. 2008. “Gaining Trust Through Online Privacy Protection: Self-Regulation, Mandatory Standards, or
Caveat Emptor,” Journal of Management Information Systems (24:4), pp. 153-173.
Taras, D. G., Bennett, J. T., and Townsend, A. M. 2004. Information Technology and the World of Work, New York: Transaction Publishers.
Taylor, C. R. 2004. “Consumer Privacy and the Market for Customer Information,” RAND Journal of Economics (35:4), pp. 631-650.
Taylor, N. 2002. “State Surveillance and the Right to Privacy,” Surveillance and Society (1:1), pp. 66-85.
Teich, A., Frankel, M. S., Kling, R., and Lee, Y. C. 1999. “Anonymous Communication Policies for the Internet: Results and
Recommendations of the AAAS Conference,” Information Society (15:2), pp. 71-77.
Thatcher, M. E., and Clemons, E. K. 2000. “Managing the Costs of Informational Privacy: Pure Bundling as a Strategy in the Individual
Health Insurance Market,” Journal of Management Information Systems (17:2), pp. 29-58.
Thelwall, M., and Stuart, D. 2006. “Web Crawling Ethics Revisited: Cost, Privacy, and Denial of Service,” Journal of the American Society
for Information Science and Technology (57:13), pp. 1771-1779.
Thomas, R. E., and Mauer, V. G. 1997. “Database Marketing Practice: Protecting Consumer Privacy,” Journal of Public Policy and
Marketing (16:1), pp. 147-155.
Tolchinsky, P. D., McCuddy, M. K., Adams, J., Ganster, D. C., Woodman, R. W., and Fromkin, H. L. 1981. “Employee Perceptions of
Invasion of Privacy: A Field Simulation Experiment,” Journal of Applied Psychology (66:3), pp. 308-313.
Turner, E. C., and Dasgupta, S. 2003. “Privacy on the Web: An Examination of User’s Concerns, Technology, and Implications for Business
Organizations and Individuals,” Information Systems Management (20:1), pp. 8-18.
Turow, J. 2003. “Americans & Online Privacy: The System is Broken,” report from the Annenberg Public Policy Center, University of
Pennsylvania.
Van Slyke, C., Shim, J. T., Johnson, R., and Jiang, J. J. 2006. “Concern for Information Privacy and Online Consumer Purchasing,” Journal
of the Association for Information Systems (7:6), pp. 415-444.
Vijayasarathy, L. R. 2004. “Predicting Consumer Intentions to Use On-Line Shopping: The Case for an Augmented Technology Acceptance
Model,” Information & Management (41:6), pp. 747-762.
Vinsel, A., Brown, B. B., Altman, I., and Foss, C. 1980. “Privacy Regulation, Territorial Displays, and Effectiveness of Individual
Functioning,” Journal of Personality and Social Psychology (39:6), pp. 1104-1115.
Volkman, R. 2003. “Privacy as Life, Liberty, Property,” Ethics and Information Technology (5:4), pp. 199-210.
Wacks, R. 1993. Privacy, New York: New York University Press.
Walczuch, R. M., Singh, S. K., and Palmer, T. S. 1995. “An Analysis of the Cultural Motivations for Transborder Data Flow Legislation,”
Information Technology & People (8:2), pp. 97-114.
Walczuch, R. M., and Steeghs, L. 2001. “Implications of the New EU Directive on Data Protection for Multinational Corporations,”
Information Technology & People (14:2), pp. 142-162.
Waldo, J., Lin, H., and Millett, L. I. 2007. Engaging Privacy and Information Technology in a Digital Age, Washington, DC: National
Academies Press.
Ware, W. H. 1976. Privacy Issues and the Private Sector, Santa Monica, CA: Rand Corporation.
Warren, C., and Laslett, B. 1977. “Privacy and Secrecy: A Conceptual Comparison,” Journal of Social Issues (33:3), pp. 43-51.
Warren, S. D., and Brandeis, L. D. 1890. “The Right to Privacy,” Harvard Law Review (4:5), pp. 193-220.
Weckert, J. 2005. Electronic Monitoring in the Workplace: Controversies and Solutions, Hershey, PA: Idea Group Publishing.
Weinstein, W. L. 1971. “The Private and the Free: A Conceptual Inquiry,” in Privacy: Nomos XIII, J. R. Pennock and J. W. Chapman (eds.),
New York: Atherton Press, pp. 624-692.
Weisband, S. P., and Reinig, B. A. 1995. “Managing User Perceptions of Email Privacy,” Communications of the ACM (38:12), pp. 40-47.
Westin, A. 2001. “Opinion Surveys: What Consumers Have to Say About Information Privacy,” Prepared Witness Testimony, The House
Committee on Energy and Commerce, W. J. “Billy” Tauzin, Chairman, May 8.
Westin, A. F. 1967. Privacy and Freedom, New York: Atheneum.
Westin, A. F. 2003. “Social and Political Dimensions of Privacy,” Journal of Social Issues (59:2), pp. 431-453.
Whitaker, R. 1999. The End of Privacy: How Total Surveillance Is Becoming a Reality, New York: New Press.
Winer, R. S. 2001. “A Framework for Customer Relationship Management,” California Management Review (43:4), pp. 8-33.
Woodman, R. W., Ganster, D. C., Adams, J., McCuddy, M. K., Tolchinsky, P. D., and Fromkin, H. 1982. “A Survey of Employee Perceptions
of Information Privacy in Organizations,” Academy of Management Journal (25:3), pp. 647-663.
Xie, E., Teo, H. H., and Wan, W. 2006. “Volunteering Personal Information on the Internet: Effects of Reputation, Privacy Notices, and
Rewards on Online Consumer Behavior,” Marketing Letters (17:1), pp. 61-74.
Xu, H. 2007. “The Effects of Self-Construal and Perceived Control on Privacy Concerns,” in Proceedings of the 28th International Conference
on Information Systems, Montréal, Canada.
Xu, Y., Tan, B., Hui, K., and Tang, W. 2003. “Consumer Trust and Online Information Privacy,” in Proceedings of the 24th International
Conference on Information Systems, S. T. March, A. P. Massey, and J. I. DeGross (eds.), Seattle, WA, pp. 538-548.
Xu, H., and Teo, H. H. 2004. “Alleviating Consumer’s Privacy Concern in Location-Based Services: A Psychological Control Perspective,” in
Proceedings of the 25th International Conference on Information Systems, R. Agarwal, L. J. Kirsch, and J. I. DeGross (eds.), Washington,
DC, pp. 793-806.
Xu, H., and Teo, H. H. 2005a. “Consumers’ Privacy Concerns toward Using Location-Based Services: An Exploratory Framework and
Research Proposal,” in Proceedings of the 13th European Conference on Information Systems, Regensburg, Germany.
Xu, H., and Teo, H. H. 2005b. “Privacy Considerations in Location-Based Advertising,” in Designing Ubiquitous Information Environments:
Socio-Technical Issues and Challenges, C. Sorensen, Y. Yoo, K. Lyytinen and J. I. DeGross (eds.), New York: Springer, pp. 71-90.
Xu, H., Teo, H. H., and Tan, B. C. Y. 2005. “Predicting the Adoption of Location-Based Services: The Roles of Trust and Privacy Risk,”
in Proceedings of the 26th International Conference on Information Systems, D. Avison, D. F. Galletta, and J. I. DeGross (eds.), Las Vegas, NV,
pp. 897-910.
Xu, H., Teo, H. H., and Tan, B. C. Y. 2006. “Information Privacy in the Digital Era: An Exploratory Research Framework,” in Proceedings
of the 12th Americas Conference on Information Systems, Acapulco, Mexico.
Yao, M. Z., Rice, R. E., and Wallis, K. 2007. “Predicting User Concerns About Online Privacy,” Journal of the American Society for
Information Science and Technology (58:5), pp. 710-722.
Young, J. B. 1978. Privacy, Chichester, UK: John Wiley & Sons.
Zakaria, N., Stanton, J. M., and Sarker-Barney, S. T. M. 2003. “Designing and Implementing Culturally-Sensitive IT Applications: The
Interaction of Culture Values and Privacy Issues in the Middle East,” Information Technology & People (16:1), pp. 49-75.
Zwick, D., and Dholakia, N. 1999. “Models of Privacy in the Digital Age: Implications for Marketing and E-Commerce,” Research Institute
for Telecommunications and Information Marketing, University of Rhode Island.