
This is an excerpt from the prospectus for Sloan and Warner,
Unauthorized Access: The Crisis in Online Privacy and Security,
forthcoming.
The Law and Economics of Liability for Unauthorized Access
A. Information: Collection, Use, and Distribution
Computers, database technology, and the Internet make it possible to
collect, aggregate, store, analyze, and distribute vast amounts of
information. The technology enables mass surveillance—constant
surveillance of almost everyone over a wide range of activities. The result is
a reduction in our ability to control when, how, and for what purpose
information is being collected and used.
B. Privacy Harms and the Public/Private Distinction
Routine mass surveillance compromises privacy. Our explanation of
the harms involved begins with the observation that our social interactions
are governed in part by norms that define a public/private distinction. They
do so by defining what personal information it is proper for others to know.
It is proper for the pharmacy clerk to ask for your name and address but not
for your vacation plans; your health insurance company may obtain and use
detailed health information but the company would cross the line if it inquired
whether you were currently happy in your marriage; your personal physician,
however, may ask that.
We advance two claims. First, the technological developments
described in the previous section have severely eroded the public/private
distinction; what was widely regarded as private in the 1950s is now
routinely available to a wide variety of businesses and
organizations. The second claim is that some public/private distinction is
essential. We contend that the erosion of the current distinction has had the
following adverse consequences (to one degree or another): an increased
risk of harm to individuals; information overload; the possession of easily
abused power by credit agencies, insurance companies, and businesses
generally; a chilling effect on decision-making; reduced opportunities for the
development of the self; and an increased ability to enforce rules and laws to an
extent that creates a merciless “Big Brother” inconsistent with both justice
and forgiveness.
C. The Current Legal Regime
This section provides an overview (easily accessible to non-lawyers) of
the laws pertaining to privacy. The law has failed to respond adequately to
the technological assault on the public/private distinction. To see why, we
ask readers to turn the clock back to the mid-twentieth century. At that
time, it was not difficult for each of us to ensure that whatever he or she thought should be
private would in fact be private. Technology has largely taken this power
away. It is now much more difficult for one to ensure that what one thinks
ought to be private is in fact private. One may not know when information is
being collected, used, or distributed. Even if one does know, one may have
little control over when, how, or why the information is being processed. Our
review of the law shows that it has not yet responded adequately to this
diminished ability to ensure that what one thinks ought to be private will in fact be
private. We contend that an adequate response requires the development of
norms defining a distinction between public and private.
D. Why Not Just Get Consent?
Why not, instead, simply require that businesses obtain our informed
consent before they collect certain types of information? Doesn’t this
requirement reestablish the power to ensure that what one thinks ought to
be private will in fact be private? The problem is that one typically “gives
consent” via standard-form, no-negotiation contracts. One does so when one
visits a web site, obtains a mortgage, opens a bank account, and so on for an
immense variety of interactions. Almost no one reads such contracts. So
how can consent be informed? [Note that we gave a much fuller argument in
class; the PowerPoint slides contain the argument.]
III. Tradeoffs, Economics, and Information Security
A. The Economic Advantages of Information Processing
We offer a framework to analyze privacy and security issues. Our goal
is to provide an approach to determining how to trade privacy off against
other values. We begin with the critical role that information plays in
enabling the market exchanges that provide us with goods and services. An
adequate flow of information is essential to market efficiency. Efficiency is
important: the more efficient one is in obtaining an end, the more one has
left over to invest in other pursuits. We illustrate the point with three
important uses of information: facilitating the extension of credit,
targeting advertising, and price discrimination. These examples are hardly
unique; information is also critical to setting insurance premiums, making
employment decisions, and news reporting, to take just a few more
examples.
1. Facilitating the extension of credit
Collection, analysis, and transfer of information are essential to
banking and the extension of credit. Informed decisions regarding the
extension of credit are essential to an efficiently functioning economy.
2. Targeting Advertising
Targeting advertising is the process of matching advertising to
recipients in ways that maximize the likelihood that recipients will purchase
in response. Targeting makes advertising more efficient. In doing so, it does
not merely benefit businesses; it benefits consumers by reducing the amount
of irrelevant information that bombards them. Technology has greatly
increased the ability of businesses to target advertising by increasing the
ability to collect, store, aggregate, and analyze information. This same
technological development also enables price discrimination (charging
different buyers different prices).
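
To make the matching idea concrete, here is a minimal Python sketch; the
recipients, ads, and estimated purchase probabilities are purely hypothetical:

    # Hypothetical recipients, ads, and estimated purchase probabilities.
    estimated_purchase_prob = {
        "recipient_a": {"running_shoes": 0.08, "mortgage_refi": 0.01},
        "recipient_b": {"running_shoes": 0.02, "mortgage_refi": 0.06},
    }

    # Show each recipient the ad with the highest estimated probability
    # of producing a purchase, rather than the same ad to everyone.
    for recipient, ads in estimated_purchase_prob.items():
        best_ad = max(ads, key=ads.get)
        print(recipient, "->", best_ad)
    # recipient_a -> running_shoes
    # recipient_b -> mortgage_refi
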
3. Price Discrimination
Price discrimination can generate significant economic efficiencies
which benefit both sellers and buyers. Realizing these efficiencies requires
sorting buyers into groups according to their willingness to pay, and that
requires a significant amount of information. Consequently, sellers structure
their interactions so that they can collect and use the necessary information.
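
A minimal Python sketch, with purely hypothetical buyer groups, prices, and
costs, illustrates how discriminatory pricing can profitably serve buyers whom
a single uniform price would exclude:

    # Two hypothetical buyer groups: (number of buyers, willingness to pay).
    groups = [(100, 50), (100, 20)]
    marginal_cost = 10  # cost of producing one unit

    def profit(prices):
        # Each group is quoted its own price and buys one unit per buyer,
        # but only if its willingness to pay is at least the quoted price.
        return sum(n * (price - marginal_cost)
                   for (n, wtp), price in zip(groups, prices)
                   if wtp >= price)

    print(profit([50, 50]))  # uniform price: only the first group buys -> 4000
    print(profit([50, 20]))  # discriminatory prices: both groups buy -> 5000

Under these hypothetical numbers, discrimination raises the seller's profit
and also serves the low-value buyers, who would be priced out of the market
at the uniform price.
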
4. Information as Power
Subsections (1)–(3) above are among many illustrations of the point that information is
power. Information is power because the more relevant information one has
in regard to an end, the more effective one is, other things being equal, in
achieving it. “Other things” may not be “equal,” of course. Information
overload can decrease effectiveness, and incomplete information can
mislead. Such exceptions aside, however, relevant information empowers
businesses by enabling them to discriminate among types of consumers in
order to treat the different types in ways most likely to yield the results the
business desires. This is why
the logic . . . of surveillance systems is to grow. Given that the
efficient pursuit of discrimination among persons is their raison d’être,
we should hardly be surprised that they tend to grow in depth—that is,
in the total amount of information collected on the average individual
with whom they deal. But surveillance systems also tend to grow
laterally—to broaden the variety of sources of personal data they rely
on in making those discriminations, especially through symbiotic
exchanges with similar systems.¹

¹ JAMES B. RULE, PRIVACY IN PERIL 18 (2007).
B. Tradeoff Questions: Two Types of Negative Externalities
The pursuit of economic efficiency raises two concerns. In explaining
these concerns, we introduce two fundamental tradeoff questions.
First, the pursuit may fail. In a market economy, a rational, purely
profit-motive-driven business will spend money to avoid harming another
only if harming that person also harms the business. Absent business harm,
the profit-maximizing strategy is to spend nothing to protect the other party.
We argue that consumers’ privacy harms typically do not impose sufficient
corresponding harms on businesses. To describe and analyze this situation,
we introduce the classic economic concept of an externality. An externality is
an effect of a decision on those who did not make the decision and whose
interests were not taken into account in making the decision. If the effect is
beneficial, the externality is labeled positive. If the effect is harmful, the
externality is labeled negative. Consumers’ privacy harms tend to be
negative externalities.
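
A minimal Python sketch, with purely hypothetical dollar figures, illustrates
why a rational, purely profit-motive-driven business ignores such an
externality:

    # Hypothetical figures, all in dollars.
    precaution_cost = 100_000         # firm's cost of stronger data protection
    firm_expected_loss = 20_000       # expected harm to the firm from a breach
    consumer_expected_loss = 500_000  # expected harm to consumers (external)

    # The firm's private calculation counts only harms to the firm itself.
    firm_takes_precaution = precaution_cost < firm_expected_loss
    # The social calculation internalizes the consumers' expected harm.
    socially_efficient = precaution_cost < (firm_expected_loss
                                            + consumer_expected_loss)
    print(firm_takes_precaution, socially_efficient)  # False True
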
We distinguish between two types of negative externality—those that
are, and those that are not, meaningfully quantifiable. Many externalities are
meaningfully quantifiable. For example, we can, to a reasonable extent,
assign a dollar value to the time, effort, and monetary losses from identity
theft. We illustrate the point with notorious cases in which a business’s
efforts at price discrimination yield less quantifiable gain than the
quantifiable loss from the negative externalities they create. These cases are examples of
quantifiable inefficiency. One key question in what follows is how best to
eliminate quantifiable negative externalities in order to achieve quantifiable
efficiency.
Many negative externalities are not, or are not easily, quantifiable. In
some cases, we simply have not done the research necessary to quantify
losses that are, in fact, quantifiable. This is certainly the case for identity
theft. The few studies that do exist are based on very small samples and
offer conflicting conclusions. On the other hand, there are losses that resist
quantification. Identity theft again provides an example. How does one
quantify the sense of personal invasion and violation many feel as a result of
identity theft? It is extremely difficult to do so in any meaningful way. We
review and criticize attempts to do so; we cover basic risk assessment and
its pitfalls. Given the existence of non-quantifiable losses, we are often
confronted with a choice between achieving quantifiable efficiency and
sacrificing some of that efficiency to avoid a non-quantifiable harm. How
do we make such tradeoffs?
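
To see the shape of the basic risk-assessment calculation, and one of its
pitfalls, consider a minimal Python sketch with purely hypothetical figures:

    # Expected annual loss = probability of the event in a year
    # multiplied by the loss if it occurs.
    loss_if_theft = 2_000  # hypothetical quantifiable loss per incident ($)

    # Small, conflicting studies yield very different probability
    # estimates, and the answer swings with them.
    for annual_probability in (0.01, 0.03, 0.05):
        print(annual_probability, annual_probability * loss_if_theft)
    # 0.01 20.0
    # 0.03 60.0
    # 0.05 100.0

A fivefold swing in the result comes from the probability estimate alone, and
the sense of violation never enters the calculation at all.
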
C. How Should We Respond?
The previous section identified two problems:
How do we eliminate quantifiable negative externalities in order
to achieve quantifiable efficiency?
How do we make tradeoffs between quantifiable efficiency and
non-quantifiable privacy harms?
We suggest that, in answering these questions, we should be guided by
answers to four other questions. We do not claim that these are the only
relevant questions, just that focusing on them is helpful here.
First: What combination of precautions efficiently prevents the loss?
By “efficiency” here we mean quantifiable efficiency.
Second: What combination of precautions best prevents non-quantifiable losses? There is considerable disagreement over what counts as
a relevant loss. Should one’s sense of being violated by identity theft count
as a serious harm that public policy should take into account? Or is one just
over-sensitive? There is also considerable disagreement over what counts as
the “best” way to prevent the harms. We emphasize that norms reduce
some of this disagreement. Norms that govern how information should be
collected, used, and distributed balance privacy against a variety of
competing concerns, and, to the extent such norms are widely accepted,
disagreement over the proper balance is reduced.
Third: Who is responsible for the loss? We discuss various grounds
for assigning responsibility.
Fourth: To what extent should those at risk insure against the loss?
Where it costs more to avoid a loss than it does to recover from it, it does
not make sense to take the excessively expensive precaution. The
availability of insurance can make recovering from the loss less expensive
than costly precautions.
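
A minimal Python sketch, again with purely hypothetical annual figures,
illustrates the comparison:

    # Hypothetical annual figures, all in dollars.
    precaution_cost = 500      # cost of the precaution that avoids the loss
    annual_probability = 0.02  # chance the loss occurs in a given year
    loss = 10_000              # size of the loss if it occurs
    insurance_premium = 250    # premium for a policy covering the loss

    expected_loss = annual_probability * loss  # 200
    options = {
        "take the precaution": precaution_cost,
        "bear the risk": expected_loss,
        "insure": insurance_premium,
    }
    # In expectation, bearing the risk is cheapest here; a risk-averse
    # party may prefer insurance; both beat the costly precaution.
    print(min(options, key=options.get))  # bear the risk
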
Ideally, with regard to losses for which insurance is not appropriate or
available, those responsible for the losses take precautions against them
that are (1) the best way to trade off quantifiable efficiency against non-quantifiable losses, and (2) within the limits of that tradeoff, the best way to
maximize quantifiable efficiency.
We need a name for these questions; call them the “framework
questions.” Society’s answers to the framework questions will contribute to
the development of privacy norms.
We will later consider three particular contexts in which these
questions arise: malware, software, and network attacks.