A FRAMEWORK FOR PRIVACY PROTECTION
Christian Baekkelund
David Kelman
Ann Lipton
Alexander Macgillivray
Piotr Mitros
Teresa Ou
December 10, 1998
TABLE OF CONTENTS
EXECUTIVE SUMMARY
  DANGER
  GOALS
  PROPOSAL
INTRODUCTION
  SCOPE
  GOALS
CURRENT AND PROPOSED REGULATORY FRAMEWORKS
  TECHNOLOGY
    Internet Protocol (IP)
      IP Logging
      The Future of IP Logging
        IP Version 6
        Anonymizing Routers
        Anonymizing Proxies
    Hypertext Transfer Protocol (HTTP)
      Cookies
        An Alternative to Cookies: Client-Side Targeting
    Other Web Technologies
      Lightweight Directory Access Protocol (LDAP) Databases
      JavaScript and VBScript
      Java and ActiveX
    E-Commerce
      Payment
        Digital Cash Protocol Type 1
        Digital Cash Protocol Type 2
      Delivery
        P.O. Boxes
        Cryptographic Delivery
        Trusted Third Party
    Case Study: A Technological Solution
      Platform for Privacy Preferences (P3P)
        P3P Strengths
        P3P Problems
  LAW
    Constitutional Law and Common Law
    Common Law
      Tort Law
      Contract Law
    Federal Statutes and Common Law
      Privacy Act of 1974
      Privacy Protection Act (PPA) of 1980
      Electronic Communications Privacy Act (ECPA) of 1986
      Legislative Outlook
    The European Data Directive
  MARKETS AND SOCIAL NORMS
    Case Study: A Norms Solution
      TRUSTe Strengths
      Problems
PROPOSAL
  ENTITLEMENT TO INDIVIDUALS
  THREE EXCEPTIONS
    Operational and Aggregated Data
    Opt-In by Contract
  AUTONOMOUS COMPUTER CONTRACTING PRIVACY TECHNOLOGY (ACCPT)
CRITIQUE
  TECHNICAL AND LEGAL COMPLEXITY ADDED BY ACCPT
    Technical Complexity for the Programmer
    Technical Complexity for End-Users
    Legal Complexity for Programmers
    Legal Complexity for End-Users
    Public Access
    A Slower Internet
  LIMITATIONS OF ACCPT
    International Conflict of Laws
    Profiling
    Operational Data
    "Race to the Bottom"
  PHILOSOPHICAL CRITICISMS OF ACCPT
    No Data Collection
    Less Regulation
CONCLUSION
APPENDIX A: TABLE OF PROPOSED LEGISLATION
APPENDIX B: EXAMPLE P3P PROTOCOL
EXECUTIVE SUMMARY
DANGER
The current direction of the information revolution threatens personal privacy.
According to the Federal Trade Commission, 92% of websites collect personal
information from their users. Add Internet Protocol (IP) address logging and that number
nears 100%. Industry self-regulation, however, has been an unqualified disaster: very
few sites provide notice of their data practices, and the default settings of software that
implicates personal privacy are at their most lenient. Users do not own their
personal information, and current informational practices point toward more abuse of
personal information rather than less.
Most large websites handle advertising through one of several large brokers. To
better target advertising, those brokers commonly track what websites a person visits, and
what that person does on each site. For instance, the first time a user downloads an ad
from DoubleClick (which handles ads for Altavista, Dilbert, and many other popular
websites), DoubleClick sets a cookie on the user’s hard drive. This cookie is used to
track the sites a user visits, keywords from Altavista searches and other information. If a
user makes a purchase on a site subscribing to DoubleClick, DoubleClick can potentially
match the user’s name to the cookie, and sell that profile to direct marketers. All of this
typically happens without the user’s consent or even knowledge.
Advertising and information brokers are not the only abusers. Individual
companies store data on their customers and are free to use that data however they see fit.
Amazon.com, for example, keeps a record of all books individuals purchase. With
modern data mining techniques, Amazon.com could collaborate with other companies to
form a detailed profile of a person. As electronic commerce spreads, the amount of
information in these profiles will become increasingly complete, and the business of
buying and selling this information will become increasingly profitable. The more
comprehensive these profiles become, the greater the potential for misuse of the
information they contain.
GOALS
In the face of these threats to privacy, the current legal framework offers
insufficient protections to personal privacy, particularly against private entities. Our
proposal attempts to create a legal and technological architecture that addresses this
problem while satisfying the following goals:
1. Protect Individual Privacy: In the straightforward words of Al Gore, "You
should have the right to choose whether your personal information is
disclosed."
2. Enable Electronic Commerce: Electronic commerce is becoming a significant
part of the U.S. economy with sales expected to reach close to five billion
dollars this year and as much as one and a half trillion dollars by the year
2002.
3. Enfranchise Everyone: Too often privacy is only for the elite. Web privacy
should be available regardless of wealth, technological, or legal savvy.
4. Encourage Diversity: We recognize that diversity and competition are the
lifeblood of the growth of the Internet and the World Wide Web. Any privacy
protection framework should encourage a diversity of technological solutions.
Similarly, a good privacy protection framework should not dictate the level of
privacy afforded each person. Since people have different privacy preferences,
any proposed framework should allow for a wide range of competing privacy
policies.
5. Enable Collective Privacy Choices: Communities have an interest in
protecting the privacy of their members and so should be able to make and
administer collective privacy choices.
Our proposal is limited in scope based on several assumptions. We do not
consider governmental privacy violations; issues relating to children’s privacy; or sniffing,
snooping, and other transmittal privacy violations. We do not expect to solve related
problems dealt with by other subject matter white papers, such as international
sovereignty issues, but do hope to identify how those problems affect our proposal.
Furthermore, our proposal relies on a number of technological and legal structures. For
example, we assume that certification and encryption can make users identifiable and
contracts verifiable. These are further detailed in the text of our proposal.
PROPOSAL
Our proposal entitles individuals to ownership of personal information. We
provide a legal and technological framework such that personal information cannot be
used by any other entity without permission from the individual. We further propose that
"default" settings on computer programs should not reveal personal information and that,
for a user to enter into a contract where the user’s information is given to an entity, the
user must be aware (either explicitly or through a transparent agent) of the terms of that
contract and be able to enforce those terms.
We begin by envisioning a legal framework whereby the use of information,
without permission, carries a statutorily determined minimum penalty. A user who feels
his or her rights have been violated may sue the offending company. This might be
administered through a government agency responsible for investigating the source of the
privacy violation -- an individual, for example, might only know that he or she had
received a solicitation without knowing which of the several companies with which he or
she had interacted had released information. Alternatively, the user could initiate her own
suit through a private right of action.
If a company wishes to use personal information about an individual, it is the
company’s responsibility to enter into a contract with that individual by asking for
permission to use the information and disclosing the intended purposes of the data
collection. This can be done either by personal, "manual" interaction, or through the use
of a computerized agent. In order for the contract formed by the computerized agent to be
legally binding, there must be some evidence that the client was aware of, and authorized,
choices made by its agent. If no such evidence exists, then the contract, or certain
provisions therein, would not be enforceable and the company would be potentially liable
for use of the personal information.
In order to remove some confusion about which autonomous agent contracts will
be enforced, we propose the Autonomous Computer Contracting Privacy Technology
(ACCPT) guidelines as a framework for enabling users to negotiate privacy preferences
with websites. In a basic ACCPT transaction, the client and server negotiate to reach a
mutually acceptable set of privacy practices. If no agreement is reached in an ACCPT
transaction, the server can log a failed attempt and sample proposals, but no additional
information about the transaction (the user’s IP address, etc.).
ACCPT incorporates the elements necessary to ensure that users are aware of the
activity of the computerized agents. To the extent that negotiation is governed by a set of
preferences entered by the user when the technology is first installed, the ACCPT
guidelines require that the agent inform users who choose the default settings of the
ramifications of the choice, and require users to assent to that choice. Any company
interacting with an agent that identifies itself as ACCPT compliant can rely on the
validity of contracts made with that agent despite potential differences between the user’s
personal preferences and the preferences as expressed by the computerized agent. That
liability would be shifted to the authors of agents pledging ACCPT compliance (under
negligence standards for deviations from the ACCPT guidelines). If a program does not
identify itself as ACCPT compliant, the contracting parties retain liability for reliance on
invalid contracts. In that way, a company which collects information from an agent that
does not pledge ACCPT compliance is at its own peril and authors of agents that falsely
pledge ACCPT compliance are also in peril.
Our proposal also encourages the development of other privacy protecting
technologies, such as a framework for anonymous browsing and electronic commerce.
This includes an addition to IPv6 for connecting to servers without disclosing one’s IP
address, an anonymous payment protocol and an anonymous shipping protocol (for
delivering goods ordered on-line).
INTRODUCTION
New technology must not reopen the oldest threats to our basic rights:
liberty, and privacy. But government should not simply block or regulate
all that electronic progress. If we are to move full speed ahead into the
Information Age, government must do more to protect your rights - in a
way that empowers you more, not less. We need an electronic bill of rights
for this electronic age.1
-Vice President Al Gore
The World Wide Web has grown exponentially from its proposal in March of
1989 as a way for physicists to keep track of information at the European Laboratory for
Particle Physics (CERN).2 More than one hundred and fifty million people now use the
Web,3 and any effort to count the number of individual sites is doomed to be outdated
before finished.4 Those sites represent a wide range of interests and ways of using the
Web but almost all share at least one characteristic: the collection of information. Ninety-two percent of websites recently surveyed by the United States Federal Trade
Commission (FTC) collect information from their users.5 Add Internet Protocol (IP)
1. Al Gore, New York University Graduation Address (May 15, 1998) (last visited December 3, 1998) <http://www.techlawjournal.com/privacy/80515gore.htm>.
2. See Tim Berners-Lee, Information Management: A Proposal (last visited December 7, 1998) <http://www.w3.org/History/1989/proposal.html> (proposing a hypertext system called "Mesh" which was later renamed the "World Wide Web"). See also A Little History of the World Wide Web (last visited December 7, 1998) <http://www.w3.org/History.html>.
3. See Nua Internet, How Many Online? (last visited December 8, 1998) <http://www.nua.ie/surveys/how_many_online/index.html>. See also Gregory G. Gromov, History of the Internet: Part 8 – Statistics (last visited December 8, 1998) <http://www.internetvalley.com/intvalstat.html> (estimating the number of US Internet users at 57 million but admitting that there is some dispute about that figure).
4. But see Tony Rutkowski, Internet Trends (last visited December 8, 1998) <http://www.genmagic.com/Internet/Trends/> (estimating the number of Internet hosts at ten million in early 1997).
5. See Testimony of Robert Pitofsky before House Subcommittee on Telecommunications, Trade and Consumer Protection, Consumer Privacy on the World Wide Web (July 21, 1998) <http://www.ftc.gov/os/1998/9807/privac98.htm>.
address logging and that number nears one hundred percent. The U.S. government
believes in industry self-regulation but thus far self-regulation has been an unqualified
disaster. Fewer than fourteen percent of the sites surveyed by the FTC even give notice of
their privacy policies.6 The default privacy settings on the two most popular web
browsers are set at their most lenient. Users do not own their personal information and
current informational practices point toward more abuse of personal information rather
than less.
Consider the hypothetical surfer, Alfred. Alfred surfs from home and goes to a
wide variety of sites. Alfred checks the performance of his favorite technology stocks on
TheStockTickerWeb.Com. Being an older man, Alfred looks for information on the
hereditary nature of prostate cancer at ProstateInfo.Com and checks out the job listings
at a web publishing site called CabledNews.com. He buys a few books on gardening at
Books4U.com and has them delivered. Even if Alfred has been careful, a large quantity
of personal information has been collected from these sites and others. Alfred may be
surprised when he receives an e-mail from TradeEase.com telling him that anyone who
invests in tech stocks should subscribe to their insider’s guide. Or, come Monday
morning, when he finds the company policy on job-seeking while still employed on his
desk. Or when he receives advertisements for a new prostrate cancer wonder drug by
mail at his home. Or when his HMO calls him in and talks about raising his rate if
prostate cancer is in his family. All of these nightmare scenarios could come
true today. Technology enables it, law allows it, the market encourages it, and social
6. See FTC, PRIVACY ONLINE: A REPORT TO CONGRESS (1998), available in <http://www.ftc.gov/reports/privacy3/exeintro.htm#Executive Summary>.
norms constrain only the most flagrant privacy violations, and then only for as long as the
public’s attention lasts.
This paper will describe the current privacy framework in the United States and
outline why neither technology, nor law, nor the market, nor social norms adequately
regulate the privacy practices of data collectors. It will examine two current privacy
solutions as case studies of how the current privacy framework is inhospitable to personal
privacy. It will propose a new framework for privacy protection and discuss some
possible critiques of the proposed framework. But first we must be clear on the scope of
the project and the goals we hoped to achieve in crafting this proposal.
SCOPE
This white paper puts forward a privacy framework that would protect the privacy
of individuals on the World Wide Web. As such, it has a wide focus but does not attempt
to solve every problem of individual privacy. In particular, we do not discuss workplace
privacy, as it is a rapidly evolving field with issues specific to the workplace that have
already been well commented on by many authors.7 We do not discuss governmental
invasions of privacy because some protection already exists in this area.8 We likewise do
not attempt to provide a framework for the protection of personal data from interception
in transit or unauthorized access in storage because these too are covered by existing
7. See, e.g., Kevin Kopp, Comment: Electronic Communications In The Workplace: E-Mail Monitoring And The Right Of Privacy, 8 Seton Hall Const. L.J. 861 (1998); Kevin J. Konlon, The Kenneth M. Piper Lecture: Privacy In The Workplace, 72 Chi.-Kent. L. Rev. 285 (1996); Kevin J. Baum, Comment: E-Mail In The Workplace And The Right Of Privacy, 42 Vill. L. Rev. 1011 (1997).
8. See infra text accompanying notes 73-85, 109-110.
privacy regulations.9 We also do not specifically target children’s privacy issues, which
have received much attention in this Congress, though our proposal is adaptable to their
needs. Instead, this paper aims to propose a framework for individual privacy protection
from private entities who collect data on the World Wide Web.
GOALS
Within the scope of individual privacy on the World Wide Web, there are many
possible frameworks that would reflect different goals. The current government stance on
privacy has allowed the goals of business to be the basis for the privacy framework in this
country. Our proposal follows directly from a different set of goals. These goals reflect
our biases and are listed below in order of their importance to us in crafting this proposal.
1. Protect Individual Privacy: In the straightforward words of Al Gore, "You
should have the right to choose whether your personal information is
disclosed."10
2. Enable Electronic Commerce: Electronic commerce is becoming a significant
part of the U.S. economy with sales expected to reach close to five billion
dollars this year11 and as much as one and a half trillion dollars by the year
2002.12
9. See infra text accompanying notes 111-116.
10. See Gore, supra note 1.
11. See NUA Internet, Online Porn Industry Facing Crisis (last visited December 8, 1998) <http://www.nua.ie/surveys/?f=VS&art_id=905354484&rel=true> (citing Forrester Research).
12. See NUA Internet, Paradigm Shifts (last visited December 8, 1998) <http://www.nua.ie/surveys/analysis/weekly_editorial/archives/issue1no49.html> (quoting Nicholas Lippis, an industry analyst).
3. Enfranchise Everyone: Too often privacy is only for the elite. Web privacy
should be available regardless of wealth, technological, or legal savvy.
4. Encourage Diversity: We recognize that diversity and competition are the
lifeblood of the growth of the Internet and the World Wide Web. Any privacy
protection framework should encourage a diversity of technological solutions.
Similarly, a good privacy protection framework should not dictate the level of
privacy afforded each person. Since people have different privacy preferences,
any proposed framework should allow for a wide range of competing privacy
policies.
5. Enable Collective Privacy Choices: Communities have an interest in
protecting the privacy of their members and so should be able to make and
administer collective privacy choices.
CURRENT AND PROPOSED REGULATORY FRAMEWORKS
Cyberspace is regulated in a variety of ways.13 Law obviously can affect behavior,
even in cyberspace, as the courts are becoming more familiar with the space. Sites react
to lawsuits14 and courts dictate the ways in which certain sites function.15 Market forces
affect the space too. Popular sites get financial backing and can afford to hire the best
graphic designers. Similarly, lucrative content16 or business models17 breed many sites.
Social norms further affect the space as sites appear to serve different communities18 and
fads dictate website style.19 Finally, the most important effect on cyberspace is that of its
13. See Lawrence Lessig, CODE, AND OTHER LAWS OF CYBERSPACE, unpublished circulating draft (1998).
14. See, e.g., Washington Post Co., et al. v. TotalNews, Inc., 97 Civ. 1190 (S.D.N.Y. Feb. 2, 1997). In early 1997, TotalNews.com, a clearinghouse site for current news stories from a variety of online media sources, was sued by some of those companies, including the Washington Post. See id. The plaintiffs objected to TotalNews.com’s "framing." A "framed" site is a site whose content is opened within another site’s pages. TotalNews framed the Washington Post’s stories within its own pages so that users saw Washington Post stories as well as TotalNews advertisements and menus in the same web browser window. The plaintiffs alleged unfair competition, misappropriation, trademark dilution and infringement, and other violations. As part of a settlement, TotalNews.com agreed to stop framing the plaintiffs' sites.
15. See, e.g., Shetland Times Ltd. v. Wills, Scot Sess-Case, 1 EIPLR 723 (1996). In late 1996 The Shetland Times sued its former editor, Dr. Jonathan Wills, over links from his new web-based newspaper, The Shetland News, to stories on the Shetland Times site. See id. See also Simulation of Shetland News Pages (last visited November 21, 1998) <http://www.shetland-times.co.uk/st/newsdemo/index.htm> (showing how the Shetland News page of October 14, 1996 linked to Shetland Times articles). An interim judgement held that it was prima facie arguable that using headlines from the Shetland Times’ website to link to the stories on that site could be a breach of copyright. See J.P. Connolly et al., Fair Dealing in Webbed Links of Shetland Yarns, 1998 J. INFO. L. & TECH. 2. A settlement after the ruling allowed The Shetland News to continue deep linking to individual stories on the Shetland Times site as long as the News identified the stories as being from the Times website in the link text. See id.
16. Pornography revenue is expected to top one billion dollars this year. See NUA Internet, supra note 11.
17. The portal frenzy is just one example of this phenomenon. See Daniel Rimer, Let the Portal Games Begin, C-NET NEWS, June 23, 1998 <http://www.news.com/Perspectives/Column/0,176,211,00.html>.
18. See, e.g., Planet Out: Land On It! (last visited December 8, 1998) <http://www.planetout.com/> (whose slogan is "The Number One Online Community of Gay, Lesbian, Bi and Trans People Worldwide"); Why America Online Sucks (last visited December 8, 1998) <http://www.aolsucks.com/>.
19. The ill-fated Netscape HTML extension <BLINK> tag is a good example of this, as is the C-Net multi-column look that has spawned a vast number of similar layouts. See Brian Wilson, Index Dot HTML: Blink (last visited December 8, 1998) <http://ec.msfc.nasa.gov/html/tagpages/b/blink.htm>; Welcome to CNET! (last visited December 8, 1998) <www.cnet.com>.
technical architecture. Those characteristics that differentiate the web from chat or a real-space library are functions of its unique architecture.
It is important to realize that all of these forces are fluid, and, in the next section,
we propose changes to the legal regime and technical architecture in order to effect
changes in the market and social norms. However, we must first understand why the
current technical, legal, market and social forces do not offer sufficient protection for
individual privacy; that is the subject of this section.
TECHNOLOGY
The current state of Internet and World Wide Web technology provides ample
opportunity for the collection of personal information about a user. More importantly, the
user possesses relatively little control over the flow of this information. One of the most
obvious privacy threats comes from information freely handed over by the user – filled-out surveys, purchases with credit cards and mailing addresses, etc. This information,
amassed, stored, and traded, can present a significant privacy threat. Other, less visible
privacy threats, of which users may or may not be aware, are embodied in the technical
architecture of the Internet and the World Wide Web.
Internet Protocol (IP)
Internet Protocol (IP), as its name suggests, is the fundamental protocol of the
Internet. Any computer connected to the Internet, regardless of the nature and composition of
the computer’s local subnet (the local network of which it is a part), has an IP address. A
computer on the Internet may have a static IP address, in which case it is always the same
unique address. Alternatively, a computer may have a dynamically allocated IP address,
which is assigned (usually on a per-session basis) from a pool of addresses available to
the computer’s subnet.
IP addresses currently consist of four one-byte (i.e., ranging from 0 to 255)
numbers, normally delimited by periods (e.g., "127.0.0.1"). A subnet may use IP
addresses that are either Class A, Class B, or Class C addresses. The class of the
addresses determines how many total IP addresses a subnet possesses. In a Class A
address, the first one-byte number in the address is identical for all machines on the
subnet (e.g., all MIT IP addresses begin with the number 18). In a Class B address,
the first two numbers in the address are identical for all machines on the subnet. In a
Class C address, the first three numbers are identical.20
IP addresses are analogous to the U.S. Post Office’s ZIP+4 numbering
system: at any one time, only one computer should have any one address, and
anything sent to that address should arrive only at that computer. IP addresses are
used to ensure that data on the Internet is sent to the correct computer. For this reason, a
computer must indicate its IP address when making a request for data, or the responding
computer will not know where to send its reply.
20. See Gopher Menu (last visited December 9, 1998) <gopher://gopherchem.ucdavis.edu/11/Index/Internet_aw/Intro_the_Internet/intro.to.ip/>.
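
To make the classful scheme concrete, the short Python sketch below (an illustrative aside of ours, not part of the paper's framework; it leans on Python's ipaddress module for convenience) reports the class and subnet size of a dotted-quad address:

    import ipaddress

    def classful_info(addr):
        """Report the classful category and network prefix of an IPv4 address."""
        first_octet = int(addr.split(".")[0])
        if first_octet < 128:       # Class A: first byte fixed for the subnet
            cls, prefix = "A", 8
        elif first_octet < 192:     # Class B: first two bytes fixed
            cls, prefix = "B", 16
        elif first_octet < 224:     # Class C: first three bytes fixed
            cls, prefix = "C", 24
        else:
            return addr + ": multicast/reserved"
        net = ipaddress.ip_network(addr + "/" + str(prefix), strict=False)
        return "%s: Class %s, subnet %s (%d addresses)" % (addr, cls, net, net.num_addresses)

    print(classful_info("18.26.0.36"))   # an MIT address: Class A, 16,777,216 addresses
    print(classful_info("192.0.2.7"))    # Class C, 256 addresses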
IP Logging
Computers that users interact with, such as web servers, will obtain the IP
addresses of their visitors. In most cases, these IP addresses are written to a log file. The
server machine can also easily indicate what IP address queried for what page. More
sophisticated IP logging systems might tie this information to other queries and data
acquired by other collection technologies, which are discussed below.
An IP address may betray certain information about a computer. At the very least,
in a subnet where IP addresses are dynamically allocated, an IP address will indicate the
general location of a computer. For instance, an IP address may indicate that a user is
from AOL or MIT. In many cases with static IP addresses, an IP address alone can indicate
the identity of a user and tie him to other data collected about him.
One hypothetical scenario might run as follows: On one day a user with a static IP
address goes to Playboy’s website and looks at a playmate pictorial but does not do
anything to identify himself in any other way. His IP will still be tied to his picture
viewing session later on. If, at another time, he revisits the Playboy site but "only reads it
for the articles," and he decides to give his name and mailing address so that he can
receive Playboy’s new free "just the articles" mailing, he has really revealed his identity
for every visit to the site. Playboy’s server can now associate a certain name and mailing
address with the IP address it originated from. Thus, the user’s name can be associated
with the pictorial-viewing web session of the previous day. IP-correlated profiles are not
foolproof, however. Under this method, the IP address only identifies the computer; a
different user of the computer may have visited the site for the pictorial query.
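
What such a log looks like is worth seeing. The fragment below (our illustration; the entry follows the common log format used by many web servers, with a made-up request) shows how a single log line already pairs an IP address with the page it requested:

    import re

    # One entry in the common web-server log format: IP, timestamp, request, status, size.
    log_line = '18.26.0.36 - - [09/Dec/1998:14:03:22 -0500] "GET /pictorial.html HTTP/1.0" 200 5120'

    pattern = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<when>[^\]]+)\] "(?P<request>[^"]+)"')
    match = pattern.match(log_line)
    if match:
        # The server can now associate this IP with this page view -- and, later,
        # with any name and address the same IP volunteers on another page.
        print(match.group("ip"), "requested", match.group("request"))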
The Future of IP Logging
IP Version 6
The Internet is currently undergoing a major technological change. Many of the
protocols designed when the Internet was a fraction of its current size are beginning to
show signs of wear. The current version of IP (Internet Protocol Version 4) only has
provisions for 32-bit IP numbers.21 This places an upper limit on the number of
computers with unique IP addresses on the Internet at once. Although 2^32 is a fairly large
number, the assignment is fairly inefficient by design. For instance, MIT has more than
16 million IP numbers for fewer than 16 thousand people. The end result is that there is a
severe shortage of IP numbers. IP numbers sell for as much as $150 per year in some
areas.
To correct this (and several smaller problems), the Internet will convert to a new
version of IP, known as IPv6 (Internet Protocol version 6). IPv6 is a complete revamp of
the most fundamental layers of the Internet. IPv6 will increase IP address size to 128 bits,
giving the astronomical number of 2^128 IP addresses. It includes other improvements useful for
automatic configuration of machines, better routing of Internet traffic, better multicasting
(transmitting the same data to a large number of machines), better machine mobility,
maintaining large networks, transmitting encrypted data and a host of other tasks.22
21. See INFORMATION SCIENCES INSTITUTE, INTERNET PROTOCOL: DARPA INTERNET PROGRAM PROTOCOL SPECIFICATION (1981), available in <http://info.internet.isi.edu:80/in-notes/rfc/files/rfc791.txt>.
22. See NETWORK WORKING GROUP, INTERNET PROTOCOL, VERSION 6 SPECIFICATION (1998), available in <http://info.internet.isi.edu:80/in-notes/rfc/files/rfc2460.txt>.
With the introduction of IPv6, there will be about 3.4*10^38 IP numbers. Within a
few years, almost all computers will have unique IP addresses. Businesses like
DoubleClick will no longer need cookies (see below) to track individual machines. The
proposals below attempt to counter this by allowing users to connect to servers without
disclosing their IP addresses.23
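
The arithmetic behind these figures is easy to check; a few lines of Python (our aside) reproduce the numbers used above:

    # IPv4 provides 2^32 addresses; IPv6 provides 2^128.
    print(2 ** 32)             # 4,294,967,296
    print(float(2 ** 128))     # about 3.4 * 10^38

    # A single Class A block, like MIT's, spans 2^24 addresses:
    print(2 ** 24)             # 16,777,216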
Anonymizing Routers
It has been suggested that the adoption of IPv6 should also include an
anonymizing router technology to allow anonymous connections. Each router between
the client and the server would mangle the IP by marking what router it came from,
adding a few random bits, and encrypting it with a private key. When the web server
responded, each router would unmangle the IP, pick out the bits indicating what router it
came from, and send it to the previous router. Keys (and their corresponding mangled IP
addresses) would expire after a preset amount of time (~24 hours). Alternatively, instead
of encrypting, routers could store IP addresses in memory, and replace them with a
reference to the IP’s location in memory.
Although mangling and unmangling IP addresses may seem like an unnecessary
burden on routers, encryption and decryption are fast, and the actual difference in cost
would be marginal. Some Cisco routers can already encrypt packets (although not for the
purpose of anonymity),24 and there are extensions to most operating systems that allow
23. See Eric Osborne, Linux IPv6 FAQ/HowTo (last visited December 9, 1998) <http://www.terra.net/ipv6/>.
24. See Cisco Encryption Technology (last visited December 9, 1998) <http://www.cisco.com/univercd/cc/td/doc/product/software/ios120/12cgcr/secur_c/scprt4/scencryp.htm>.
for ordinary computers to do the same.25 Simple encryption chips easily achieve speeds in
excess of 177 Mbits/sec.26 On the low-end, instead of encryption, IP addresses could
easily be stored in random slots in a cheap memory table.
The adoption of anonymizing routers could make the tracking of
"crackers" (individuals who break into other computers) significantly more difficult.
Right now, IP logs can be used to track many attacks. With anonymizing routers, law
enforcement would need to ask (or possibly subpoena) every router between the cracker
and the compromised machine to unmangle the IP and figure out where the attack
originated. However, experienced crackers already conceal or erase their IP addresses
from logs. It does not take long for amateur crackers to pick up techniques from experts.
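
The low-end, memory-table variant mentioned above can be sketched in a few lines. The Python below is our own toy model (class and field names are hypothetical; a real router would do this per-packet, in hardware, with expiring entries):

    import secrets

    class AnonymizingRouter:
        """Toy model: replace source IPs with random tokens kept in a memory table."""

        def __init__(self):
            self.table = {}  # token -> real source IP; flushed after ~24 hours

        def mangle(self, source_ip):
            token = secrets.token_hex(8)    # a few random bits per connection
            self.table[token] = source_ip   # remember where the packet came from
            return token                    # forward this in place of the real IP

        def unmangle(self, token):
            # Used on the return path (or, as discussed above, under subpoena).
            return self.table[token]

    router = AnonymizingRouter()
    outgoing = router.mangle("18.26.0.36")
    print(outgoing, "->", router.unmangle(outgoing))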
Anonymizing Proxies
Anonymizing proxies serve the same purpose as anonymizing routers, but use a
different technology. Instead of connecting directly to a web server, the web browser
connects to a trusted proxy and asks it to retrieve a web page. The proxy retrieves the
web page, and sends it back to the browser. The web server only knows the proxy’s IP,
and not the user’s. Since anonymizing proxies are shared by a number of users and do not
transmit any individually identifying information, web servers have no way of tracking
users going through proxies.
There are a number of disadvantages to anonymizing proxies. A malicious or
25. See S/Wan Online Interoperability Test (last visited December 9, 1998) <http://www.rsa.com/rsa/SWAN/swan_test.htm>.
26. BRUCE SCHNEIER, APPLIED CRYPTOGRAPHY 264 (1994).
compromised proxy could log all of its users' transactions. Although there are free
anonymizing proxies available on the web, they are not very popular for a number of
reasons. To begin with, few people know about them. Of those that do, most opt not to
use them because there is a noticeable lag, and making the appropriate configurations on
the web browser is a hassle. Further, because there are twice as many requests for every
page retrieved when proxies are used, universal adoption could significantly increase
traffic on the Internet. And if a significant number of individuals start using them, people
will no longer be able to afford to operate proxies for free.
However, proxies are already used in a number of places for other reasons.
Filtering proxies censor certain types of web content in libraries, schools, small
oppressive countries, and large oppressive businesses.27 Caching proxies remember pages
that were recently accessed by users, and can quickly retrieve them for other users,
increasing speed and reducing bandwidth.28 Firewalls that filter all packets sent to the
Internet frequently provide proxies for web content.29 Converting an existing proxy to
remove identifying information is easy, and anonymizing proxies are an excellent option
for places where another form of proxy is already in use.
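
A toy anonymizing proxy is only a few lines of Python (our sketch, using only the standard library; a real proxy would also strip cookies and other identifying headers, handle errors, and pass response headers through):

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    class AnonProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # When a browser is configured to use a proxy, self.path holds the
            # full URL. The fetch below originates from the proxy's IP, so the
            # destination server never sees the user's address.
            with urlopen(self.path) as upstream:
                body = upstream.read()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)  # relay the page back to the user

    HTTPServer(("", 8080), AnonProxy).serve_forever()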
Hypertext Transfer Protocol (HTTP)
Hypertext Transfer Protocol (HTTP) is the protocol used by web servers to fulfill
27. See, e.g., FILTERING ASSOCIATES, PROXY SERVER FILTERING SOLUTIONS FOR ISP AND BUSINESS NETWORKS (last visited December 9, 1998) <http://www.filtering.net/Products/Servers/servers.html>.
28. See THE NATIONAL JANET WEB CACHE SERVICE, INTRODUCTION TO WWW CACHING (last visited December 9, 1998) <http://wwwcache.ja.net/intro.html>.
29. See, e.g., UNICOMP TECHNOLOGIES, FIREWALL SECURITY (last visited December 9, 1998) <http://www.firewallsecurity.com/microsoft/>.
user requests for web pages. One of the distinguishing features of HTTP (as opposed to
File Transfer Protocol or Telnet) is its transient nature. There is no persistent connection
between a web server and browser during an HTTP session. The browser makes a
request, the server fills it and moves on to its next request. When your browser makes
another request, it does so as if it had never made the first. This is a good thing because it
reduces server load (the server does not need to keep a connection open with your
computer while you browse a page) but it is a bad thing because your browser must make
a new connection for every request and the server treats every request as unrelated to any
other. So-called "stateless" protocols are a problem for features like shopping carts or
password saving because such features require some memory of what happened in
previous requests from the same browser.
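
The statelessness described above is visible at the socket level. In the Python sketch below (our illustration, against a placeholder host), each request opens a fresh connection, and nothing ties the second request to the first:

    import socket

    def fetch(host, path):
        # Each HTTP/1.0 exchange is self-contained: connect, ask, read, close.
        s = socket.create_connection((host, 80))
        s.sendall(("GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)).encode())
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
        s.close()
        return b"".join(chunks)

    # To the server, these two requests are entirely unrelated.
    page_one = fetch("www.example.com", "/")
    page_two = fetch("www.example.com", "/")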
Cookies
Cookies were created to add “state” to HTTP. They are bits of information,
generally text, that may be stored on either a user’s computer (client-side cookies) or on a
web server that the user is visiting (server-side cookies). Client-side cookies may be
specified to be readable by only a particular machine (e.g., personal.lycos.com), only
machines in a certain domain (e.g., preferences.com), or any computer. The readability
scope of server-side cookies depends on the network and architecture of the server site
(for instance, data may be stored in a central LDAP database, discussed below).
Users, in general, have very little control over the use of cookies. Some web
browsers allow users to completely disable cookies (which prevents the user from taking
advantage of a large number of useful sites and services in the web industry), to accept or
reject them one by one (which can become very tiresome very quickly), or to accept
cookies which may only be read by the specific server which issued them. Users have no
control over server-side cookies. Although website providers have more control over
server-side cookies, they generally use client-side cookies for several reasons. To begin
with, web browsers automatically save client-side cookies between web sessions, and
browsers often keep a different set of cookies for each user on a computer. The
maintenance of server-side cookies between sessions, by contrast, generally requires a
user to log in to the server at the beginning of each session to identify himself, although
IP address logging, which may not indicate the same computer across multiple sessions,
may eliminate this requirement. Additionally, increasing the scope of readability for a
client-side cookie (e.g., for a whole domain) is much easier than for a server-side cookie.
Sharing server-side cookies often requires a larger investment in a particular server
architecture and intranet.30
Because of their persistence, as well as their user-level targetability, client-side
cookies serve as excellent tools to track users and to tie together various pieces of data
collected about them. A server may assign a user a unique ID in a cookie. This may serve
the same basic identifying function as an IP address, but the ID is more likely to be
traceable to a single, unique user – although, unlike an IP address, a technically savvy
user may reject or even edit/hack client-side cookies. Once a user has a unique cookie ID,
he may be tracked across a site or even multiple sites in a domain. Then, if at any of these
30. See NETSCAPE, CLIENT-SIDE STATE – HTTP COOKIES (last visited December 9, 1998) <http://home.netscape.com/newsref/std/cookie_spec.html>.
sites a user provides other information about himself, such as an address, as in the
Playboy scenario, or an interest in a certain book on Amazon.com (by adding it to his
"shopping cart"), the new information can be tied together with the user ID and any other
information collected.
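
The mechanics are simple. The sketch below (our illustration; the header fields follow Netscape's cookie specification cited above, but the uid scheme and domain are hypothetical) shows how a server's first response can brand a browser with a unique ID that every later request will return:

    import secrets

    def build_response_headers(request_headers):
        headers = ["HTTP/1.0 200 OK", "Content-Type: text/html"]
        if "Cookie" not in request_headers:
            uid = secrets.token_hex(16)  # a unique, persistent ID for this browser
            # domain=.example.com makes the ID readable across the whole domain;
            # the far-off expiration date keeps it on disk between sessions.
            headers.append("Set-Cookie: uid=" + uid +
                           "; domain=.example.com; expires=Fri, 31-Dec-99 23:59:59 GMT")
        return headers

    print("\n".join(build_response_headers({})))  # first visit: a new uid is issued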
An Alternative to Cookies: Client-Side Targeting
DoubleClick, Preferences.com, and other targeted web advertisement companies
use cookies to direct ads at individuals based on their browsing habits. An alternative to
this system would be one in which the web browser would log browsing habits and pick
ads itself. Targeted ad companies would no longer need knowledge about individuals to
target advertising.
In the simplest version, an ad company would send multiple ads with lists of
keywords. The browser would pick the one most appropriate for the user based on
browsing habits and tell the ad provider which ad it picked through an anonymous
protocol, like the survey protocol described above.
In more advanced versions, an ad company could send a bit of Java bytecode that
the browser would run to decide between ads. The browser could also have a profile of
the user filled out by the user when the browser initially runs. If connections to the ad
provider are properly anonymized, the provider could have a conversation with the
browser to pick the best ad.
Guaranteeing that an ad company cannot tie back ad choices to individual
browsers is very important; otherwise, this technology could be more dangerous than
cookies. An ad company could ask very specific questions, log the results, and build a
much more detailed profile.
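
The simple keyword version can be made concrete. In the Python sketch below (our illustration; the ad list and history format are hypothetical), the browsing history never leaves the client, and only the winning ad's ID is reported back:

    ads = [
        {"id": "ad-01", "keywords": {"stocks", "investing", "finance"}},
        {"id": "ad-02", "keywords": {"gardening", "books", "outdoors"}},
        {"id": "ad-03", "keywords": {"travel", "hotels"}},
    ]

    # Kept locally by the browser; never transmitted to the ad company.
    local_history = {"gardening", "books", "jobs"}

    def pick_ad(ads, history):
        # Choose the ad whose keywords best overlap the local browsing history.
        return max(ads, key=lambda ad: len(ad["keywords"] & history))

    chosen = pick_ad(ads, local_history)
    print("display:", chosen["id"])  # only "ad-02" is revealed, via an anonymous channel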
Other Web Technologies
Lightweight Directory Access Protocol (LDAP) Databases
Lightweight Directory Access Protocol (LDAP) was designed to provide quick
access to a directory or database of information over a network. An LDAP database, like
any database, can store information in, and look up information based upon, various
fields. Thus, an LDAP database might store users, their e-mail addresses, preferences,
mailing information, etc. LDAP is useful for collecting personal data on the Internet
because multiple servers within a website provider’s intranet may easily and quickly
access a central system of LDAP database servers. Because most modern-day, industrialstrength websites, actually involve multiple mirrored front end servers and multiple
layers of computers behind the front end in order to handle heavy loads, an LDAP-style
system allows users’ personal information to be collected, distributed, and stored more
easily.31
This technology has important implications for collection of private data. As
stated, LDAP databases can store a wealth of information, and the LDAP front end
protocol and implementation allows for rapid storage and retrieval of this information.
These properties alone greatly reduce the cost of collecting private information.
Additionally, because LDAP allows many machines to easily centralize data, collected
31. See Lightweight Directory Access Protocol (last visited December 9, 1998) <http://www.umich.edu/~dirsvcs/ldap/>.
personal information can be made available to a larger number of server machines and
websites. In essence, LDAP is a technology that provides a backbone for large scale
collection and sharing of users' private information. Without a technology like LDAP,
several servers, on the same site or different sites, might collect a large amount of
information about a user, but the machines would be unable to easily collate the data. Such a
situation would be somewhat like the "fog of war," where many events are observed in
many different places, but the communication of information from various battlegrounds
to the generals in the army headquarters is difficult. LDAP provides the roads and
couriers to bring the information to the headquarters.
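
In practice, the lookup each front-end server runs is short. The sketch below is ours, written with the third-party ldap3 Python library; the server, credentials, and attribute names are hypothetical. It shows how any of a site's mirrored web servers could pull one user's profile from the central directory:

    from ldap3 import Server, Connection, SUBTREE

    server = Server("ldap.example.com")
    conn = Connection(server, user="cn=webfront,dc=example,dc=com",
                      password="secret", auto_bind=True)

    # Any mirrored front end can run this same lookup against the shared directory.
    conn.search("ou=users,dc=example,dc=com", "(uid=alfred)",
                search_scope=SUBTREE,
                attributes=["mail", "postalAddress", "purchaseHistory"])
    for entry in conn.entries:
        print(entry)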
JavaScript and VBScript
JavaScript and VBScript are two similar technologies (the former developed by
Netscape, the latter by Microsoft), that direct web browsers to perform certain functions
and transfer certain information. When a user downloads a page from a server, that page
may contain these technologies, which instruct the browser to perform certain tasks either
automatically or upon certain user actions. Most data-collection methods that utilize these
technologies rely on bugs in the implementations and specifications of the scripts.
Although most or all of these bugs have been corrected in the latest versions of the
browsers that support JavaScript and/or VBScript, a large number of people are still using
versions that harbor security weaknesses. Thus, a brief discussion of these weaknesses is
warranted as part of the current state of technology.
Scripts can obtain information about a user by reading their web caches (pools of
recently visited web pages that a browser often keeps for better performance) or reading
data about a computer type or the files within it. This data is often sent back to the server
through a hidden form similar to a typical HTML form, except that all of the fields are
invisible. Needless to say, these methods for collecting personal data are sufficiently
shady that most or all mainstream commercial websites do not utilize them. Despite this,
the potential risk of privacy violations through script security holes is very high.
Java and ActiveX
Java and ActiveX are two somewhat similar (for our purposes) technologies (the
former developed by Sun, the latter by Microsoft) that allow small, contained programs
(applets or components), originating from a web server, to be executed on a user’s
computer. Java programs are usually restricted by a sandbox or a security manager. The
former is a sealed-off environment on the user’s computer which contains no personal
data or files; the latter is a more granular system that allows programs to request, and be
allowed or denied, access to various personal computer resources, so that by default
they cannot collect any useful personal data. However, if the user explicitly grants
permissions to the Java programs, they may read or manipulate files and settings on the
user's computer, as well as send information to another machine on the Internet. Only the
latest versions of Java implementations provide a full security manager as opposed to a
sandbox.
ActiveX is based on an all-or-nothing web-of-trust system. A user downloads an
ActiveX component onto his system, and if he allows it to run, it has free rein on his
computer. ActiveX components can be signed by a "trusted entity" that vouches for the
identity of the original author, and sometimes, the security of the component. It is up to
the user to decide whether he trusts the author and the "trusted entity."
Despite the potential for extra security protection in the latest versions of Java and
ActiveX, the ability of a site to collect private information is still quite strong. In the end,
a user must trust those sites to which he does grant special access to protect the
confidentiality of the information they collect. Non-technological standards, such as
TRUSTe (see below), help to alleviate this problem to some extent.
E-Commerce
Payment
Credit cards are the most common method of payment on-line, but they have a
number of problems associated with their use. Users must disclose their names, card
numbers, and card expiration dates. Furthermore, credit cards leave an audit trail; the
credit card companies have a log of almost all purchases made on-line. To solve the
privacy problem (and more importantly, the potential problems of on-line credit card
fraud), cryptographers have come up with a number of secure, on-line payment protocols.
The goals of such protocols, as defined by T. Okamoto and K. Ohta, are:
- Independence: Does not depend on physical location or objects (can be used on-line).
- Security: Cannot be copied.
- Privacy: No one can trace users.
- Off-Line Payment: Does not require a permanent network connection for use.
- Transferability: Can be transferred between users (merchants are no different from normal users).
- Divisibility: A large e-bill can be split up into smaller e-bills (and the reverse).32
DigiCash, once the dominant player in this field, recently filed for Chapter 11
bankruptcy. Several other companies are currently developing smart cards, and in the
process, digital cash. Two proposed payment protocols for protecting privacy under a
digital cash regime are outlined below.
Digital Cash Protocol Type 1
In the first protocol, each denomination is tied to a public/private key pair.
Owners of that denomination hold the private key and a banking institution holds the
public key. When a buyer makes a purchase, the seller creates a private/public key pair,
and shows the buyer the public key. The buyer sends an anonymous, encrypted
instruction to the financial institution, requesting that the money be transferred to a new
public key. He signs the request with the old private key. The financial institution
publicly broadcasts a signature of the transaction, so the seller knows the money is
available under his private/public key pair.
This protocol meets the needs of independence, security, transferability,
divisibility, and privacy. It fails for smart cards and off-line payment.
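
A minimal sketch of the first protocol's signing step follows (ours, using the third-party Python cryptography package with Ed25519 signatures as a stand-in for whatever scheme a bank would actually deploy; all names are hypothetical):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    # The coin: the buyer holds the private key, the bank knows the public key.
    coin_private = Ed25519PrivateKey.generate()
    coin_public = coin_private.public_key()

    # The seller creates a fresh key pair and shows the buyer its public half.
    seller_private = Ed25519PrivateKey.generate()
    seller_public = seller_private.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)

    # The buyer signs a transfer instruction with the coin's old private key.
    instruction = b"transfer denomination-100 to:" + seller_public
    signature = coin_private.sign(instruction)

    # The bank verifies the signature against the coin's registered public key
    # (this raises an exception on forgery), then broadcasts a signed receipt.
    coin_public.verify(signature, instruction)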
Digital Cash Protocol Type 2
32. See T. Okamoto & K. Ohta, Universal Electronic Cash, in ADVANCES IN CRYPTOLOGY – CRYPTO '91 PROCEEDINGS 324-337 (1992).
Most cryptographic protocols designed to work with smart cards are based on a
physical scheme:33 Alice prepares 1000 money orders for $100. She puts each, together
with a piece of carbon paper, in a different envelope. She hands the 1000 envelopes to the
bank. The bank verifies that 999 of the envelopes contain a money order for $100,
deducts $100 from Alice's account, and blindly signs the last one. Alice gives the money
order to a merchant. The merchant verifies the bank's signature, and accepts the money.
The merchant takes the money order to the bank, which verifies its own signature, and
gives the merchant the money.
Note that electronically, the bank can sign an "envelope" by signing a one-way
hash of the money order -- the "closed envelopes." The bank "opens" them by requesting
that Alice show a money order, and verifying the hash on that order.
This protocol meets five of the six goals. Though Alice can easily copy the money
order and spend it more than once, this can be corrected by adding a random uniqueness string to each
money order. When the bank receives a money order, it can verify that it has not received
a money order with the same uniqueness string before.34
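
The envelope trick translates directly into hashes. The sketch below (ours; the amounts, counts, and formats are illustrative) shows the bank opening 999 of Alice's 1000 sealed money orders before blindly signing the last:

    import hashlib
    import secrets

    def make_order(amount):
        uniqueness = secrets.token_hex(16)  # the anti-double-spending string
        return ("money order for $%d, id %s" % (amount, uniqueness)).encode()

    orders = [make_order(100) for _ in range(1000)]
    envelopes = [hashlib.sha256(o).digest() for o in orders]  # "sealed" one-way hashes

    keep = secrets.randbelow(1000)  # the bank picks which envelope stays sealed
    for i, order in enumerate(orders):
        if i == keep:
            continue
        # "Opening" envelope i: Alice reveals the order; the bank checks that its
        # hash matches the sealed envelope and that the amount really is $100.
        assert hashlib.sha256(order).digest() == envelopes[i]
        assert b"$100" in order

    # Having seen 999 honest orders, the bank blindly signs envelopes[keep].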
Although this protocol prevents Alice and the merchant from cheating the bank, it
does not identify whether it was the merchant or Alice who tried to cheat. There
is an additional protocol that determines this, although it is impossible to explain in detail
without presenting a stronger cryptographic background. The basic idea is:
33. See SCHNEIER, supra note 26, at 117-18.
34. See id. at 118-19.
Alice, in addition to putting a uniqueness string on the money order, places on it a
series of 100 "identity pairs," designed in such a way that any one element of a pair does
not reveal her identity, but any matching pair can be combined to give her identity (name,
address, etc.). When Alice makes a purchase, the merchant asks Alice to reveal a specific
half of each identity string ("Show me the second string from the first pair, the first from
the second pair ..."). If Alice tries to cheat and copy her money order, the second time she uses it the merchant will almost certainly ask for a different set of identity strings (the chance of asking for the same set is one in 2^100). The bank could then match up the two halves of an identity pair and confirm that Alice tried to cheat. If a merchant tries to cheat, he would have only one set of identity strings, and the bank could tell it was the merchant.35
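The identity pairs can be built by XOR-splitting Alice's identity, a simple secret-splitting construction consistent with the description above. The sketch below uses 8 pairs instead of 100 purely for brevity; one revealed half discloses nothing, but both halves of any pair combine to reveal the identity.

```python
# XOR secret-splitting sketch of the "identity pairs" idea.
import secrets

identity = b"Alice, 123 Main St."
K = 8                                      # the text uses 100 pairs

pairs = []
for _ in range(K):
    left = secrets.token_bytes(len(identity))           # alone, reveals nothing
    right = bytes(a ^ b for a, b in zip(left, identity))
    pairs.append((left, right))            # left XOR right == identity

challenge1 = [secrets.randbelow(2) for _ in range(K)]   # merchant 1's request
challenge2 = [secrets.randbelow(2) for _ in range(K)]   # merchant 2's request

reveal1 = [pairs[i][challenge1[i]] for i in range(K)]
reveal2 = [pairs[i][challenge2[i]] for i in range(K)]

# If Alice double-spends, with probability 1 - 2**-K some pair was opened on
# both halves, and the bank can XOR them to recover her identity.
for i in range(K):
    if challenge1[i] != challenge2[i]:
        cheater = bytes(a ^ b for a, b in zip(reveal1[i], reveal2[i]))
        print("bank recovers:", cheater.decode())
        break
```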
Perfect digital cash allows for significant abuses. For instance, with any protocol of the second type, consider the perfect crime:
Alice kidnaps a baby. Alice sends the law enforcement authorities an anonymous
note with the hashes of 1000 money orders, and a request that a bank sign them and
publish the signatures in a newspaper (or else the baby dies). The authorities comply.
Alice frees the baby. Unlike physical systems, there is no way to track Alice. She is free
to spend the money.36
Delivery
When goods are ordered on-line, they need to be delivered. Traditional delivery
35 See id. at 120-22.
36 See S. von Solms & D. Naccache, On Blind Signatures and Perfect Crimes, 1992 COMPUTERS & SECURITY 581-83.
methods require the user to disclose his name and address. A number of anonymous delivery methods exist that are designed to circumvent this problem. Notice how the three methods below parallel the status quo, anonymizing proxies, and anonymizing routers for delivering packets on the Internet.
P.O. Boxes
Classically, anonymous delivery has been handled through post office boxes. The
user is assigned a unique, untraceable number by the post office. Deliveries can be sent
by that number without disclosing the user’s identity.
But post office boxes are a poor solution. They are expensive, and most people cannot afford a unique box for each transaction. Eventually, a person's box may be tied back to his identity, compromising all of his previous transactions.
Cryptographic Delivery
The post office publishes a public key every month. When a user gives out his
address, he encrypts it, together with an expiration date and a few random bits, with the
post office’s public key. The post office can decrypt it and deliver to that address, but the
sender cannot trace it down to a specific person. Further, unsolicited mail can only be
sent for a short time, because the address expires. The encrypted data could also contain
the merchant’s name and address, so the post office would only deliver packages sent by
the merchant.
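A sketch of what the encrypted address bundle might look like, using the third-party Python `cryptography` package; the key size, field names, and padding choice are modern illustrative assumptions rather than anything a post office of the time would have deployed.

```python
# The buyer encrypts address + expiration + random bits + merchant name under
# the post office's published key; the merchant sees only an opaque blob.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Stand-in for the key pair the post office would publish monthly (assumption).
post_office_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
published = post_office_key.public_key()

bundle = b"|".join([
    b"addr=42 Example St, Cambridge MA",
    b"expires=1999-01-31",                  # unsolicited mail stops here
    b"nonce=" + os.urandom(8).hex().encode(),  # random bits defeat guessing
    b"merchant=Example Books Inc.",         # only this sender may use it
])

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
opaque_address = published.encrypt(bundle, oaep)   # what the merchant stores

# Only the post office can turn the blob back into a street address.
print(post_office_key.decrypt(opaque_address, oaep))
```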
Trusted Third Party
The product is sent to a trusted third party, which forwards the product to the
intended recipient. This doubles the cost of shipping, and it is often not possible to find a
trusted party offering this sort of service.
Case Study: A Technological Solution
In recent times, the lack of good privacy on the Internet has been spotlighted and examined under pressure from numerous sources. The general Internet user’s desire for stronger privacy protection as he uses the Internet37 and the impending possibility of governmental intervention to create stricter privacy controls38 have driven many individuals, corporations, and organizations to try to create solutions to the expressed privacy concerns. For example, some individuals have created anonymizing remailers for anonymous e-mail,39 and others have created anonymizing browsers or proxies.40 Because of the difficulty users encounter in trying to determine websites’ privacy practices, many individuals and organizations have made proposals to help users control the release of their personal information.41
In June of 1997, Netscape Communications, Inc., FireFly, LLC., and Verisign,
Inc., proposed the Open Profiling Standard (OPS) to address concerns about privacy on
37 See Pitofsky, supra note 5.
38 See id.
39 See Ian Goldberg, David Wagner & Eric Brewer, Privacy Enhancing Technologies for the Internet (last visited Dec. 9, 1998) <http://www.cs.berkeley.edu/~daw/privacy-compcon97-www/privacy-html.html>.
40 See, e.g., Anonymizer, Inc. (last visited Dec. 9, 1998) <http://www.anonymizer.com>.
41 See FTC, supra note 6.
the web.42 Under OPS, when a user goes to a website, the website may (via HTTP, as proposed) ask the user’s computer for information about that user. Any piece of information about the user can be marked by the user as something his computer is always allowed to freely distribute (ALLOW), something that should never be distributed (DENY), or something the user should be prompted to decide upon at the time of each request. This information would be kept locally on his computer in a "virtual business card," or vCard,43 which contains certified information about the user, as the user chooses.44
Platform for Privacy Preferences (P3P)
While OPS has for the most part fallen by the wayside, another proposal, developed in its wake and adopting many of its features, is the Platform for Privacy Preferences (P3P).45 This proposal, sponsored by the World Wide Web Consortium (W3C), has been in development since October of 1997 and is still under development today. The concept behind P3P is that the user can set up a flexible software agent to represent his privacy interests; as the user moves around the WWW, the agent interacts with websites to create agreements as to what information the user agent, and thus the user, will give to a website, as well as how the website can
42 See NETSCAPE, OPEN PROFILING SPECIFICATION SUBMISSION (1997), available in <http://www.w3.org/Submission/1997/6/>.
43 vCards are a technology created by Verisign, and thus constitute Verisign's main role in the OPS relationship.
44 See Netscape, supra note 42.
45 See W3C, Platform for Privacy Preferences Project (last modified July 16, 1998) <http://www.w3.org/P3P/>.
use this information.
Figure 1: A P3P pseudonymous interaction with a service46
A P3P session begins when a user makes a request for a URL (see Figure 1,
above). The contacted server then replies to the user with one or more machine-readable
P3P proposals encoded in Extensible Markup Language (XML)47 as part of the HTTP
headers. In this proposal, the organization maintaining the server makes a declaration of
its identity and privacy practices. This proposal, applicable only to the requested URL
and similar sites,48 lists what data elements the server would like to collect from the user
while the user is connected, how those data elements are to be used, and specifically,
46 JOSEPH REAGLE & LORRIE FAITH CRANOR, CACM PLATFORM FOR PRIVACY PREFERENCES (last visited Dec. 10, 1998) <http://www.w3.org/TR/1998/NOTE-P3P-CACM-19981106/>.
47 For more information on XML, see W3C, Extensible Markup Language (last modified Nov. 11, 1998) <http://www.w3.org/XML/>.
48 Also called a realm.
whether those data elements will be used in a personally identifying manner.49
The proposal is then parsed by the user’s agent, which would most likely be an extension of the user's browser/client software. The agent compares the proposal to the user’s privacy settings. If the user already has settings that agree with the proposal, the proposal may be automatically accepted. If the user has no settings appropriate to the proposal, the user’s agent may present the user with the proposal and prompt for a response, reject the proposal outright, create an alternative proposal (counter-proposal) and send it to the server, or ask the server for an alternative proposal. Once a mutually acceptable proposal is found between the user’s agent and the server, an agreement is reached, and the user gains access to the service.50
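The decision logic of such an agent might be sketched as follows; the field names and preference structure are invented for illustration and are not the P3P specification's actual vocabulary.

```python
# Sketch of an agent's accept / prompt / counter decision loop (not the spec).
from dataclasses import dataclass

@dataclass
class Proposal:
    realm: str
    elements: frozenset        # data elements the service wants to collect
    identifiable_use: bool     # will the data be used in an identifying way?

prefs = {
    "allow": {"user.language", "user.timezone"},   # always freely given
    "deny": {"user.email"},                        # never given
}

def evaluate(p: Proposal) -> str:
    if p.elements & prefs["deny"] or p.identifiable_use:
        return "reject-or-counter"   # agent may send a counter-proposal
    if p.elements <= prefs["allow"]:
        return "accept"              # settings cover it: automatic agreement
    return "prompt-user"             # no setting covers this proposal

print(evaluate(Proposal("http://example.com/*",
                        frozenset({"user.language"}), False)))   # -> accept
```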
P3P Strengths
P3P has a number of important, inherent strengths. Many of these strengths stem from the fact that P3P was designed with flexibility in mind. In this respect, P3P is very different from many of the other W3C meta-data activities, such as the Platform for Internet Content Selection (PICS). PICS is used in the privacy domain to statically define privacy practices (as well as content), and the only control available based upon a PICS tag is a complete accept or a complete reject.51 P3P allows for far greater freedom to negotiate, for multiple proposals may be sent by the service, and multiple
49 See Appendix B: Sample P3P Proposal.
50 See W3C, Platform for Privacy Preferences (last modified Nov. 6, 1998) <http://www.w3.org/TR/1998/NOTE-P3P-CACM-19981106/>.
51 See W3C, Platform for Internet Content Selection (last visited Dec. 9, 1998) <http://www.w3.org/PICS/>.
counter-proposals and requests may be made by the user’s agent before an acceptable
contract is reached. There are also a number of points in the use of P3P where additions
or changes are possible. For example, one important possible addition is for the user
agent to store the proposal IDs (also called propIDs) that are transmitted on the
agreement of a proposal.52 The next time the user tries to access the site, the saved
propID need merely be sent with the user’s request for the service to show that a previous
contract has already been struck between the user and the service and the user desires to
use the same contract again.53
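As a sketch, an agent might keep a small table mapping realms to accepted propIDs and replay them on later visits; the header name and data structure here are assumptions made for illustration, not the proposal's actual wire format.

```python
# Hypothetical propID cache: replay a prior agreement instead of renegotiating.
agreements = {}                            # realm -> propID of accepted proposal

def request_headers(realm: str) -> dict:
    if realm in agreements:
        # A previous contract exists; present its ID with the request.
        return {"P3P-PropID": agreements[realm]}   # header name is invented
    return {}                              # no prior contract: full negotiation

agreements["http://example.com/*"] = "propID-7f3a"
print(request_headers("http://example.com/*"))
```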
Another benefit of this flexibility is that the P3P specification defines a standard set of base data elements54 that all P3P-related software should recognize, but provides a mechanism that lets services create new data elements for users to consider. Also, the fact that the W3C is making P3P's design generally available will encourage its adoption and improvement, fostering competition and innovation.
During a P3P negotiation, Temporary Unique IDs (TUIDs) are the mechanism that allows P3P to keep track of the state of the current negotiation: who last "spoke" (the user’s agent or the service), the IP address of the user (so that the service knows where to send data), and so on. These TUIDs are necessary for the
52 See W3C, supra note 50.
53 See supra Figure 1.
54 See W3C, P3P Base Data Sheet (last visited Dec. 6, 1998) <http://www.w3.org/TR/WD-P3P/basedata.html>.
connection and negotiation to occur, but they are especially useful to websites trying to collect aggregate information on the users that connect to their website.55 Pairwise Unique IDs, or PUIDs, are an additional mechanism for giving the service information to be collected in aggregate. If a user sets up his user agent to use PUIDs, additional information about the agreement that has been reached will be transmitted to the service for collection and aggregation.56 The combined use of PUIDs and TUIDs is a possible improvement on the current cookie mechanism by which a website collects aggregate information about its users. Also, to facilitate the communication of personal information when it is desired by both the website and the user, such as in the process of making an on-line purchase, the P3P user agent will store a base set of user data elements locally that can then be sent easily while connected to a website via P3P. In fact, the combined use of a data repository with PUIDs and TUIDs to replace cookies is actually highly beneficial to businesses desiring to collect information on their users. The cookie structure as it currently exists only allows cookies to persist for three months. The use of these P3P features would offer a website more precise and longer-lasting information on the users of its service,57 and it would do so, unlike cookies, in a manner which the user finds acceptable – or it won't be done at all.
Another positive point in the design of P3P is its use of existing technological standards. The proposed transport mechanism for P3P proposals and data
55 See W3C, Platform for Privacy Preferences Syntax Specification (last visited Dec. 6, 1998) <http://www.w3.org/TR/WD-P3P/syntax.html>.
56 See id.
57 Telephone Interview by Christian Baekkelund with Joseph M. Reagle, Jr., Resident Fellow, Berkman Center for Internet & Society, Harvard Law School (Dec. 3, 1998).
elements is an extension of HTTP 1.1, the standard protocol used for WWW information transport.58 P3P also makes use of XML for "creating elements used to structure data" and thus to communicate the machine-readable P3P proposals and counter-proposals. The Resource Description Framework (RDF)59 is used to provide a "model for structuring data such that it can be used for describing Web resources and their relations to other resources" for the data elements in the proposal.60 The use of these standards will allow servers and user agents capable of P3P negotiations to be developed more quickly and easily. Also, the use of a harmonized vocabulary for the proposals means that there will be no misinterpretation between the public and a service about what the privacy practices described in a proposal actually mean, and will also enforce the requirement that certain basic privacy practices be disclosed.61 Furthermore, the use of P3P does not inhibit the use of other technological privacy solutions, such as anonymized browsing.62
P3P Problems
P3P also has a number of faults, however. The design for P3P,63 now in its
58 For more information on HTTP, see NETWORK WORKING GROUP, HYPERTEXT TRANSFER PROTOCOL (1996), available in <http://info.internet.isi.edu/in-notes/rfc/files/rfc1945.txt>.
59 For more information on RDF, see W3C, Metadata and Resource Description (last visited Dec. 9, 1998) <http://www.w3.org/Metadata/>.
60 See W3C, supra note 50.
61 See W3C, P3P Harmonized Vocabulary Specification (last visited Dec. 7, 1998) <http://www.w3.org/TR/1998/WD-P3P-harmonization.html>.
62 See W3C, supra note 50.
63 See W3C, Platform for Privacy Preferences Specification (last visited Dec. 6, 1998) <http://www.w3.org/TR/WD-P3P/>.
third revision, has grown somewhat overly large, complicated, and unwieldy.64 This verbosity in the definition of the P3P communication syntax will be a burden on developers who want to write software for P3P transactions, and may even deter some developers from writing such utilities at all. Also, the extra communication that must occur before a user accesses a website in a "normal" fashion will create a much greater amount of network traffic and load on web servers, inhibiting the overall speed and growth of the Internet. The fact that P3P communications are not encrypted or otherwise protected also leaves the user data transmitted via P3P open to third-party interception.65
Implementation issues for P3P servers and agents, and questions regarding the penetration of P3P, also threaten its success. If agents are not easy to use, people will not want to bother with the initial setup of a P3P user agent,66 nor will they want to deal with proposals that their agent cannot decide and that require a dialog with them. The authors of popular user agents will be in an extremely important position in the creation of the default profile with which the agents are distributed. Currently, Microsoft and Netscape dominate the WWW browser market,67 and if either were to build a user agent into its browser, the default profile with which that browser/agent shipped would drastically change the privacy practices of the entire Internet. Finally,
64 Telephone Interview by Christian Baekkelund with Joseph M. Reagle, Jr., Resident Fellow, Berkman Center for Internet & Society, Harvard Law School (Dec. 3, 1998).
65 See KENNETH LEE & GABRIEL SPEYER, PLATFORM FOR PRIVACY PREFERENCES PROJECT & CITIBANK (1998), available in <http://www.w3.org/P3P/Lee_Speyer.html>.
66 See id.
67 See Steve Lohr & John Markoff, Netscape Bid Seen by America Online, N.Y. TIMES, Nov. 23, 1998, at A1.
the biggest problem impeding the desired penetration of P3P is the lack of any
enforcement of the P3P negotiated contracts for privacy policy. Essentially P3P assumes
that after having reached an acceptable contract with a service, the user will then trust the
service to abide by that contract – an assumption that simply cannot be made.68
On a more basic level, the fact remains that websites currently have no incentive to adopt P3P – the current legal regime allows them to collect information at will,69 and
so they have no reason to expend time, effort, and resources to form contracts with
particular users to obtain their data. As long as websites do not adopt P3P, users will not
push for the product – and so long as users do not push for it, websites will not feel
pressure to adopt it. This point is discussed in more detail in the Markets and Norms
section.
If the market's current resistance to the adoption of P3P can be overcome, P3P has a good chance of achieving its desired goals – to better protect the privacy of users while supplying websites with a fair amount of information about them. In fact, both Microsoft and Netscape have made numerous public statements as to their "commitment to privacy," and it is widely believed that companies would create user agents for P3P upon the completion of the P3P specification. The developers of Apache, the most popular web server in use today, have affirmed a commitment to P3P as well.70 With some mechanism in place to encourage these and other sites to make good on their
68 See LEE & SPEYER, supra note 65.
69 See Legal Section.
70 Interview by Christian Baekkelund with Daniel Weitzner (Dec. 8, 1998).
commitments, P3P could likely succeed.
LAW
There are currently no legal limits on the use of information gathered through the
methods described above in the absence of an explicit promise by the website to limit the
use of such data.71 To date, federal governmental response to the possibility of privacy
violations on the Internet has been to recommend industry self-regulation.72 However, a
review of the current legal patchwork regarding informational privacy provides some
insight as to how privacy is traditionally conceived, and protected, in the United States.
Moreover, the erratic development of privacy law in the "real world" may be indicative of
future difficulties with protecting privacy in cyberspace.
In the United States, privacy is protected by three mechanisms of law:
constitutional law, case or common law, and statutes. As the supreme law of the land,
the Constitution provides the framework within which the limits of privacy protection
71 For instance, Deirdre Mulligan of the Center for Democracy and Technology pointed out this exact problem before the Senate Committee on Commerce, Science, and Transportation, Subcommittee on Communications. She states:
The Federal Trade Commission's jurisdiction over deception generally comes into play where a company has inaccurately stated its information practices to consumers. Therefore, those who make no statements about how they handle data are less likely than those who inform consumers of data use to be prosecuted. While the FTC's legal framework is likely to discourage self-regulatory activities, the FTC, the Department of Commerce and the White House have actively encouraged and prodded industry to take steps to protect privacy or be subject to legislation. The recent Geocities settlement offers an example of the problem. Geocities was found to be in violation of their privacy policy. Had they not posted a privacy policy it seems unlikely that the FTC would have sought them out for action.
FDCH Congressional Testimony, September 23, 1998.
72 See Beth Givens, A Review of the Fair Information Principles: The Foundation of Privacy Public Policy (last visited Dec. 7, 1998) <http://www.privacyrights.org/ar/fairinfo.html>.
operate. Federal and state statutes, operating within the confines of the Constitution,
explicitly regulate behavior and provide protection for privacy as conceived by the vox
populi. State and federal courts interpret statutes and apply constitutional principles, generating case or common law, which, by precedent, governs future legal decision-making.
Constitutional Law and Common Law73
In 1791, the Bill of Rights was created to protect the people of the United States
from the government.74 Primarily through the First, Third, Fourth, and Fifth
Amendments, various "zones of privacy" are protected from federal75 government
intrusion.76 Although the right to privacy may not have been explicitly stated as a right
of U.S. citizens, various courts have interpreted several Amendments77 to provide
73 Because constitutional law by its nature requires interpretation by courts, it is almost impossible to provide a coherent analysis of constitutional doctrine without cases applying the principles. Thus, the discussion of constitutional law is supplemented by some case discussion.
74 Because the Bill of Rights was created to protect against the abuses of government power, it provides little to no protection from private actors, although there is a small exception for when a private entity is essentially acting in a governmental capacity. See, e.g., Norwood v. Harrison, 413 U.S. 455 (1973) (affirming that "a state may not induce, encourage or promote private persons to accomplish what it is constitutionally forbidden to accomplish").
75 Although our focus is on efforts by the federal government to protect privacy, it is noteworthy that nine states explicitly guarantee the right to privacy within their constitutions: Alaska, Arizona, California, Florida, Hawaii, Illinois, Louisiana, Montana, and Washington. See Comment, E-mail in the Workplace and the Right of Privacy, 42 VILL. L. REV. 1011, 1018-19 (1997) [hereinafter Workplace]. Of these nine states, only California extends protection of the right to privacy not only to public employees, but also to private employees. See id. at 1019.
76 See Griswold v. Connecticut, 381 U.S. 479 (1965).
77 Further discussion of the Third and Fifth Amendments is not necessary, as their context is rather limited. For example, the Third Amendment is a prohibition against the quartering of soldiers in homes without consent, i.e., protecting the privacy/sanctity of a citizen's home. See U.S. CONST. amend. III. The Fifth Amendment protects an individual's right against self-incrimination – again, the right to withhold speech. See U.S. CONST. amend. V. The Ninth and Fourteenth Amendments are also often included in discussions of constitutional amendments protecting privacy. See Center for Democracy and Technology, CDT's Guide to Online Privacy (visited Dec. 1, 1998) <http://www.cdt.org/privacy/guide/basic/index.html>.
important ancillary protections for privacy.
The First Amendment78 is the authority behind freedom of speech. Although the
constitutional prohibition on the regulation of speech usually concerns the freedom to
speak, the doctrine also includes the freedom not to speak.79 Although the right to
anonymity -- the right to withhold speech identifying oneself -- has been repeatedly
upheld in the context of the print media, it is still unsettled whether this protection will be
extended to on-line media.80 The First Amendment also provides for the right to free
assemblage. It has been interpreted to protect the "freedom to associate and privacy in
one's associations."81 For example, this has been extended to privacy in one's marital
relations.82
78 "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press, or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances." U.S. CONST. amend. I.
79 See McIntyre v. Ohio Elections Commission, 514 U.S. 334 (1995) (affirming the right to conduct anonymous political leafletting).
80 In 1996, the Georgia legislature enacted a statute prohibiting online users from using pseudonyms or communicating anonymously over the Internet. See GA. CODE ANN. § 16-9-93.1 (Supp. 1997). In response, the ACLU and the Electronic Frontier Foundation brought suit in September 1996 in the Federal District Court for the Northern District of Georgia, and obtained a preliminary injunction against enforcement of the statute. See ACLU v. Miller, No. (N.D. Ga. June 20, 1997). For more information, see Jeff Kuester, Unconstitutional Georgia Internet Police Law (visited Dec. 2, 1998) <http://www.kuesterlaw.com/kgalaw.htm>.
81 NAACP v. Alabama, 357 U.S. 449 (1958) (ruling unconstitutional an Alabama statute requiring the registration of the NAACP membership list with the state).
82 Although courts have generally been reluctant to intrude on the right of privacy in one's bedroom, this right has been, and continues to be, the subject of many battles. See Griswold v. Connecticut, 381 U.S. 479, 484 (1965) (striking down a Connecticut statute forbidding the use of contraceptives because it conflicts with the exercise of the right to marital privacy); see also Roe v. Wade, 410 U.S. 113 (1973). But see Bowers v. Hardwick, 478 U.S. 186 (1986).
The Fourth Amendment83 protects people from searches based on a "reasonable
expectation of privacy" test84 by excluding from criminal trials evidence obtained by an
unconstitutional search. While the vagueness of the term "reasonable expectation" has foreseeably led to much litigation, the translation of the Fourth Amendment to the on-line world adds further wrinkles to the debate over its meaning.
Harvard Law School Professor Lawrence Lessig has lectured about this problem,
pointing out that technology will soon make it possible to conduct "burdenless"
searches.85 The Fourth Amendment prohibits "unreasonable" searches – and the newest
advances now bring to light the latent ambiguity behind the term. Does "unreasonable"
really refer to expectations of privacy or to the burden upon the individual searched?
Common Law
The Constitution provides protection for private citizens from the government; for
intrusions of privacy from non-state actors, common law has conferred some protection
as well.86 Two basic areas of common law have been invoked to preserve people's right
to privacy: tort law and contract law. Tort law involves wrongs committed by one person
to another; the courts have long recognized the tort of invasion of privacy. Contract law
83 "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." U.S. CONST. amend. IV.
84 See Katz v. United States, 389 U.S. 347 (1967) (establishing the reasonable expectation of privacy test).
85 Professor Lawrence Lessig, Lecture for Cyberspace and Social Protocols, Fall 1998.
86 At its earliest conceptions, however, the right to privacy was not widely accepted. See Roberson v. Rochester Folding Box Co., 64 N.E. 442, 443 (N.Y. 1902) (rejecting a right to privacy because past American courts had not yet recognized such a right, preferring instead to defer to the legislature's power to create one). Later courts realized that a right to privacy did exist, emanating from the various guarantees in the U.S. Constitution. See supra text accompanying notes 73-85.
involves agreements between parties and may govern the level of privacy granted, either
explicitly (as in a contract for data that includes privacy provisions) or implicitly (as in an
informal employment contract).
Tort Law
William Prosser, the original draftsman of the Restatement (Second) of Torts,
delineated the four actions in tort for invasions of privacy as:
1) intrusion upon the plaintiff's seclusion or solitude, or into his private affairs;
2) public disclosure of embarrassing private facts about the plaintiff;
3) publicity which places the plaintiff in a false light in the public eye; and
4) appropriation, for the defendant's advantage, of the plaintiff's name or
likeness.87
There are many exceptions and defenses to the above claims,88 and it is often difficult in
the privacy realm to specify what damage resulted from the wrong,89 making these
doctrines very difficult to use in court.
According to Joel Reidenberg, most state courts recognize actions for at least one
87 See William Prosser, The Right of Privacy, 48 CAL. L. REV. 383, 389 (1960).
88 For example, information voluntarily disclosed or readily available from public sources is not protected under any of these actions. See Joel R. Reidenberg, Privacy in the Information Economy: A Fortress or Frontier for Individual Rights?, 44 FED. COMM. L.J. 195, 223-24 (1992). Moreover, plaintiffs may have to meet the standard of "offensiveness," i.e., the information revealed/publicized/misappropriated was extremely private, or of showing that the information was publicly distributed. In the context of the privacy violations envisioned by this White Paper, such a threshold is extremely difficult to overcome.
89 See id.
of these privacy invasions90 and some have codified aspects of the four actions in both civil and criminal statutes.91 Yet Reidenberg notes that full recognition of all four actions is not the case in the majority of states. New York, for example, allows only the misappropriation claim,92 and Minnesota is reputed to disallow all four actions for privacy intrusion.93
Contract Law
Whereas tort law awards compensation for intrusions on privacy after the fact,
contract law attempts to set the terms of the "intrusions" beforehand. Most of the
measures suggested to protect online privacy, either by technology or by statute,
inherently rely on contract law – that is, they envision people contracting to "sell"
information about themselves in exchange for access. This means that all of these measures are limited by the problems that beset contract law, such as lack of informed consent, unequal bargaining power, and mandatory non-negotiable contracts (also known as contracts of adhesion). For example, some websites offer basic (and superficial)
90 See RESTATEMENT (SECOND) OF TORTS § 652A app. reporter's note (1989 & Supp. 1990). See also Kelly v. Franco, 391 N.E.2d 54, 57-58 (Ill. App. Ct. 1979) (holding that Illinois protects only against the misappropriation of a person's name or likeness for commercial purposes); Strutner v. Dispatch Printing Co., 442 N.E.2d 129, 134 (Ohio Ct. App. 1982) (holding that Ohio has not adopted the false light privacy tort); Kalian v. People Acting Through Community Effort, 408 A.2d 608 (R.I. 1979) (holding that Rhode Island recognizes only limited common law rights of privacy), all cited in Reidenberg, supra note 88.
91 See, e.g., CAL. CIV. CODE § 3344 (West Supp. 1991); FLA. STAT. ANN. § 540.08 (West 1988); KY. REV. STAT. ANN. § 391.170 (Michie/Bobbs-Merrill 1984); MASS. ANN. LAWS ch. 214, § 3A (Law. Co-op. 1986); NEB. REV. STAT. §§ 20-201 to -211 (1987); N.Y. CIV. RIGHTS LAW §§ 50, 51 (McKinney 1990); OKLA. STAT. ANN. tit. 21, § 839.1 (West 1983); R.I. GEN. LAWS §§ 9-1-28 to 9-1-28.1 (1985); TENN. CODE ANN. §§ 47-25-1103, 47-25-1105 (1988 & Supp. 1990); UTAH CODE ANN. § 45-3-3 (1988); WIS. STAT. ANN. § 895.50(2) (West 1983), all cited in Reidenberg, supra note 88.
92 See, e.g., Gautier v. Pro-Football, Inc., 107 N.E.2d 485, 487-88 (N.Y. 1952); Anderson v. Strong Memorial Hospital, 531 N.Y.S.2d 735 (N.Y. Sup. Ct. 1988), cited in Reidenberg, supra note 88.
93 See ROBERT ELLIS SMITH, A COMPILATION OF STATE AND FEDERAL PRIVACY RIGHTS 29 (1988), cited in Reidenberg, supra note 88.
information about their privacy policies and warn the user that use of the site is
conditioned upon acceptance of these practices.94 However, there is no opportunity for the user to bargain over the terms of the contract – the terms are offered on a take-it-or-leave-it basis. Because most sites have similar policies, or do not delineate their practices
at all, the user generally has only the choice between surfing on the terms of website
owners or not surfing at all. Further, there is little point to a user rejecting a particular
website over its privacy policies, as many other sites – which do not reveal their policies
– may use the user's data, and so any particular transaction has little value to the user.95
However, in real space, the legal issues involved in cases revolving around contract terms with explicit privacy protections are relatively uninteresting. Rather than presenting
novel issues of law, the translation of contract law from the real world to the cyberworld
would seem to lead to consistency and present little difficulty.96
In those cases where terms governing the protection of privacy have not been
explicitly delineated in contracts or agreements, the courts have been more divided.
Privacy implied in contracts mostly surfaces as an issue in workplace privacy, where
employees may or may not have waived their right to privacy implicitly by agreeing to
94 For example, at the FX Networks site, the privacy policy reads: "Use and participation in FX's Internet sites is contingent upon your acceptance of the following terms and rules. By using our sites, you accept these terms. In addition, your continued use of this site after a change has been made to the policies of our sites implies your assent to those changes." FX Networks, Terms, Conditions, and Privacy (last visited Dec. 7, 1998) <http://www.fxnetworks.com/fxm/htmls/copyright.html>.
95 See Jerry Kang, Information Privacy in Cyberspace Transactions, 50 STAN. L. REV. 1193, 1253-55 (1998).
96 Granted, there is the current furor over the proposed UCC Article 2B. See, e.g., Brenda Sandburg, UCC Talks Falling Apart, IP MAGAZINE (Oct. 8, 1998) <http://www.ipmag.com/dailies/981008.html>. However, this is outside the scope of this paper.
work for an employer.97 However, the trend appears to be that the courts will allow most
monitoring on the theory that there is no "reasonable expectation of privacy" in the
workplace.98 Techniques include video surveillance, keystroke monitoring, and
monitoring of e-mail and Internet usage.99
Federal Statutes100 and Common Law101
In 1977, recognizing the danger posed by new technology, Congress established
the Privacy Protection Study Commission in order to conduct an extensive study of
privacy rights in a world with computers.102 The Commission focused on record-keeping
in eight different areas: (1) consumer-credit; (2) the depository relationship; (3) mailing
lists; (4) insurance; (5) the employer-employee relationship; (6) medical care; (7)
investigative reporting agencies; and (8) education.103 The study concluded that, even within the limits of law, privacy was not sufficiently protected from either government or private industry.104 Despite this early warning, it is only recently that Congress
97 See Susan Stellin, Privacy in the Digital Age, CNET (May-June 1996) <http://www.cnet.com/Content/Features/Dlife/Privacy/ss05.html>.
98 See Smyth v. Pillsbury Co., 914 F. Supp. 97 (E.D. Pa. 1996).
99 As explained earlier, see Introduction, this is largely outside the scope of our focus. For a more complete explanation, see Workplace, supra note 75.
100 For a good discussion of state laws protecting privacy, see Reidenberg, supra note 88, at 227-37.
101 In a manner similar to the above discussion of constitutional law, an analysis of disembodied statutes and regulations is largely meaningless without application to real-life scenarios. Thus, case law discussing the application of statutes is included in this section.
102 See U.S. PRIVACY PROTECTION STUDY COMMISSION, PERSONAL PRIVACY IN AN INFORMATION SOCIETY (1977).
103 See id. at xiii.
104 See id.
began to move towards more protection of privacy.105
While the substantial number of Internet bills introduced recently in Congress
seems promising,106 very few bills have actually been enacted, suggesting that the
government is reluctant to make many changes to its policy on the Internet.
Unfortunately, Congress' inaction, particularly in the area of privacy, does not mean that
the status quo will prevail. The currently permitted level of "legitimate" access to information leaves a great deal of room for the quickly improving methods of collecting, aggregating, organizing, storing, and "mining" data. Without a radical move by the
legislature, novel cases and controversies will continue to arise, the legal vacuum will
have to be filled in "on a piecemeal basis by ad hoc judicial decisions, initiatives by state
attorneys general and other state and local regulators, and by restrictions imposed on the
United States from abroad"107 – providing little consistency and even less peace of mind
for online participants.
Legislation regarding privacy has been scarce and fairly spotty; legislation
regarding privacy online is even rarer. The government deals with particular areas on an
ad hoc basis, largely in response to public outrage,108 or as an afterthought in legislation
105 See Reidenberg, supra note 88, at 196-98.
106 While in the 104th Congress only fifty bills directly related to the Internet were introduced, that number has approximately doubled for the 105th Congress. See Nicholas W. Allard, Privacy On-Line: Washington Report, 20 HASTINGS COMM. & ENT. L.J. 511, 515 (1998). In addition, and perhaps more illustrative of the growing governmental interest in the Internet, Strom Thurmond (R-S.C.), the oldest member of the Senate, became the 100th member of the bicameral Congressional Internet Caucus. See id.
107 See id.
108 For example, the Video Privacy Protection Act of 1988, Pub. L. No. 100-618, 102 Stat. 3195 (1988), was enacted to protect information about the movie videos people buy or rent after such information was disclosed during the confirmation process of Supreme Court nominee Robert Bork.
dealing mainly with other bodies of law. Moreover, most of the legislation has been in
reference to the government intruding on an individual's privacy. Very little protection
has been added for intrusions by private actors.
Privacy Act of 1974
Probably the earliest important piece of privacy legislation was the Privacy Act of 1974 (as amended).109 It granted individuals the right to see, copy, and correct their federal agency records, as well as to restrict disclosures of the information without consent. The limitation is that the Privacy Act applies only to government actors.
Privacy Protection Act (PPA) of 1980
The government made another important move with the enactment of the Privacy
Protection Act (PPA) of 1980.110 The PPA prohibits law enforcement from searching or
seizing "work product" and "documentary" materials from journalists and publishers
unless they have probable cause to believe the person possessing the materials is involved
in a crime, and the materials sought are evidence of the crime. Thus, online systems and
their users are protected by the PPA "to the extent they act as publishers and maintain
publishing-related materials on the system." Again, this was a limit on government action, and provided almost no protection against private entities.
109 Privacy Act of 1974, 5 U.S.C. §§ 552a-559 (1994).
110 Privacy Protection Act of 1980, 42 U.S.C. §§ 2000aa et seq. (1980).
Electronic Communications Privacy Act (ECPA) of 1986
With the Electronic Communications Privacy Act (ECPA) of 1986, the
government finally took steps to limit the reach of private actors. Enacted to amend Title
III of the Omnibus Crime Control and Safe Streets Act of 1968, which limited the use of
telephone wiretapping, the ECPA covers all forms of digital communications, including
private e-mail. Title I of the ECPA111 prohibits unauthorized interception and Title II
governs unauthorized access to stored communications. The distinction between
messages in transmission and stored messages is important because transmitted messages
are afforded a much higher level of protection than stored communications; Title II
provides for lower fines, higher burdens of proof, and lower requirements for security.112
Courts, following the foundational case of Steve Jackson Games v. United States Secret
Service,113 have interpreted Title I very narrowly, leaving stored e-mail -- even where a
111 18 U.S.C. § 2510 et seq. (1998). See especially § 2510(4) ("interception" includes the "acquisition of the contents of any … electronic communication"); § 2510(17) (defining "electronic storage" to include electronic temporary, intermediate, and backup storage); § 2511 (Title I) (prohibiting interception of electronic communications); and § 2701 (Title II) (prohibiting unauthorized intentional access to stored electronic communications). See also § 2702(a):
(1) a person or entity providing an electronic communication service to the public shall not knowingly divulge to any person or entity the contents of a communication while in electronic storage by that service; and
(2) a person or entity providing remote computing service to the public shall not knowingly divulge to any person or entity the contents of any communication which is carried or maintained on that service…
But note that § 2702(a) is narrowly construed such that only entities who provide electronic communication services to the public at large (not just to employees) qualify for the prohibition. See Andersen Consulting LLP v. UOP, 991 F. Supp. 1041 (N.D. Ill. 1998) (UOP’s divulging Andersen Consulting’s e-mail does not fall within § 2702 because UOP is not an e-mail provider to the public).
112 See Nicole Giallonardo, Casenote: Steve Jackson Games v. United States Secret Service: The Government's Unauthorized Seizure of Private E-Mail Warrants More Than the Fifth Circuit's Slap on the Wrist, 14 J. MARSHALL J. COMPUTER & INFO. L. 179 (1995) (government can get stored e-mail with a warrant but would need a court order for interception).
113 36 F.3d 457 (5th Cir. 1994).
user has not yet accessed it -- with only Title II protection.114 It is questionable whether e-mail packets "stored" on routers en route between mail servers would receive Title I protection, but the courts have not yet addressed this issue.
Courts have been extremely reluctant to find that messages were intercepted
during transmission. Moreover, employers may avoid liability under Title I by showing
that the employee consented to the interception either expressly or by implication. Under
the "ordinary course" exception, an employer may intercept an employee's e-mail
communications in the ordinary course of its business if it uses "equipment or [a] facility,
or any component thereof" furnished by the provider of the electronic communication
service in the ordinary course of its business. However, legitimate purposes may include
monitoring to reduce personal use.115
A person violates Title II of the ECPA, the Stored Communications Act, if he or
she "intentionally accesses without authorization a facility through which an electronic
communication service is provided." Such unlawful invasions are more severely
sanctioned when the violation was motivated by "commercial advantage, malicious
destruction or damage, or private commercial gain." There are, however, two major
exceptions. There is no violation of the Stored Communications Act when the conduct is
authorized "by the person or entity providing a wire or electronic communications
service." Many legal commentators warn that this exception may be interpreted by the
114 See id.; see also A. Michael Froomkin, The Metaphor Is the Key: Cryptography, the Clipper Chip, and the Constitution, 143 U. PA. L. REV. 709, 864 n.676 (1995) (discussing the Steve Jackson case as troubling in its narrow interpretation of the ECPA, denying Title I protection to the e-mails, but pointing out that the e-mails would still be protected by Title II of the ECPA).
115 See Susan Stellin, Privacy in the Digital Age, CNET (May-June 1996) <http://www.cnet.com/Content/Features/Dlife/Privacy/ss03.html>.
courts to mean that private employers who maintain a computer system have the ability to browse employee e-mail communications without violating Title II of the ECPA.
Under the user exception or consent exception, the Stored Communications Act
does not apply to conduct authorized "by a user of that service with respect to a
communication of or intended for that user." Such authorization may be expressly given,
and in some cases, reasonably implied from the surrounding situation.
One final exception to the ECPA may be required by the Constitution. Since the
jurisdiction of Congress is limited to interstate commerce, the definition of "electronic
communications" under the ECPA is limited to communications and systems which
"affect interstate or foreign commerce." Therefore, a computer system which does not
cross state lines may not be covered by the ECPA. This would seem to exempt company-owned, internal e-mail systems.116
These acts and other similar laws protect areas as broad and diverse as the interception and disclosure of electronic communications,117 government-maintained databases,118 the regulation of credit reports119 and financial records,120
116 More has been written regarding employer monitoring of e-mail. Most articles conclude that employees have little recourse, and cite Bosses With X-Ray Eyes for evidence that employers monitor e-mail. See Charles Piller, Bosses With X-Ray Eyes, MACWORLD, July 1993, at 118 (21% of employers search employee files; over 66% of those who do don’t tell the employees). The best, most recent discussions are Anne L. Lehman, Comment, E-Mail in the Workplace: Question of Privacy, Property or Principle?, 5 COMM. LAW CONSPECTUS 99 (1997) and Thomas R. Greenberg, Comment, E-Mail and Voice Mail: Employee Privacy and the Federal Wiretap Statute, 44 AM. U. L. REV. 219 (1994).
117 See Electronic Communications Privacy Act of 1986, 18 U.S.C. § 2701 (1986).
118 See Privacy Act of 1974, 5 U.S.C. §§ 552a-559 (1994).
119 See Fair Credit Reporting Act of 1970, 15 U.S.C. §§ 1681-1681t (1994).
120 See Right to Financial Privacy Act, 12 U.S.C. §§ 3401-3422 (1995).
telemarketing,121 and the "Bork law" that protects video rental records.122 Unfortunately,
the industry-specific approach taken thus far by the government has not produced a
general protection for personal data online. Moreover, some experts contend that existing
provisions are inconsistent if not contradictory.123
Legislative Outlook
Unfortunately, it appears that a general protection for online information will not
be created any time soon. Except for a few pockets, such as encryption legislation124 or
protection of children,125 the probability of congressional action on any number of
privacy bills is rather low. Echoing the thoughts of other commentators, Nicholas Allard
conjectures that:
The immediate impact on Congress of the FTC's decision to give industry
(IRSG) self-regulation a chance to work, and the uncertainty over how this
approach can be reconciled with the European Union's privacy rules,
121 See Telephone Consumer Protection Act of 1991, 47 U.S.C. § 227(b)(1) (1991).
122 See Video Privacy Protection Act of 1988, Pub. L. No. 100-618, 102 Stat. 3195 (1988).
123 See Robert O'Harrow, Jr., Laws on Use of Personal Data Form a Quilt With Many Holes, WASH. POST, Mar. 9, 1998, at A12, cited in Allard, supra note 106, at 529.
124 For example, the "Secure Public Networks Act," S. 909, 105th Cong. (1997), was passed by the Senate Commerce Committee on June 19, 1997 and later amended slightly. The bill would require the registration of keys with key recovery agents. On the other side of the spectrum, the Security and Freedom Through Encryption Act (SAFE), H.R. 695, 105th Cong. (1997), opposes control over encryption and attempts to impose key recovery; this bill has 249 cosponsors and has been placed on the calendar.
125 Despite industry opposition, several bills which would protect children appear close to being passed. In the House, the Children's Privacy Protection and Parental Empowerment Act, H.R. 1972, 105th Cong. (1998), would criminalize the selling of personal information about a child absent consent from a parent. This measure currently has 52 cosponsors and is pending before the House Crime Subcommittee. Moreover, S. 771 mandates the labeling of advertisements and requires that Internet service providers make available blocking software to filter out advertisements. See S. 771, 105th Cong. (1997). The idea behind this legislation is that children are not able to discern between program content and advertising. This measure is pending before the Senate Communications Subcommittee.
probably will derail any action on privacy legislation this year.126
Regardless of the expectations for enactment, most of the currently proposed legislation concerning privacy, abundant though it may be, would not create a general protection for online information. Instead, it replicates the past governmental pattern of providing industry-specific measures.
The proposed on-line privacy legislation can be divided into four main categories:127
1) Protection of Children
2) Protecting Personal Identification Data
3) Protecting Medical Records
4) Regulating Direct Marketing and Unsolicited E-Mail
For a more complete description of the proposed legislation, please see Appendix, Table of Pending Legislation.
The European Data Directive
While the U.S. has been moving slowly on the issue of privacy, this is not solely a
domestic issue – and the U.S. needs to consider its role in the global information market.
126 Allard, supra note 106, at 533.
127 Encryption is not addressed here. Bills and proposals regarding encryption are numerous enough to warrant a separate paper.
The most urgent consideration, the European Union Data Directive,128 went into effect
October 25 of this year. The Directive sets out privacy standards to be observed in the
collection, processing, and transmission of personal data (standards that are rarely
observed in the US). European countries are also directed to prohibit transfer of personal
data to countries that do not "ensure adequate levels of protection."129 Although the Directive was passed in 1995, it was only this past summer that U.S. government and industry began to seriously entertain the thought that the European Union would actually find U.S. practices not up to European standards of data protection, and consequently restrict data flow.130 In fact, until very recently, the U.S. response had been to lobby European countries to convince them that current U.S. policies were adequate.131
Article 25 of the Directive has caused a stir in the United States because it
prohibits transfers to countries that do not have similar privacy regulations giving
"adequate protection" to consumers. The most basic provisions of the Directive are quite
contrary to the status quo in the United States. For example, the Directive, if enacted into
law by an EU member state, would require that data collectors provide advance notice of
128 See European Parliament, Directive 95/46/EC (Oct. 24, 1995) [hereinafter Directive] <http://www2.echo.lu/legal/en/dataprot/directiv/directiv.html>.
129 See id. at art. 25.
130 See David Banisar, The Privacy Threat to Electronic Commerce, COMM. WEEK INT'L, June 1, 1998 <http://www.totaltele.com/cwi/206/206news13.html>.
131 See id. According to Banisar, in fact, the lobbying could have unfortunate results:
In order to put pressure on Brussels, the White House has sent envoys around the world to push other countries, such as Japan, to not adopt privacy laws. They have threatened to take the EU to the World Trade Organization, even though GATT specifically exempts privacy laws. It is hard to imagine that Brussels would not find these actions and statements highly provocative and they may respond strongly in October.
their intent to collect and use an individual's personal data;132 in addition, they must give that individual access to her data,133 the means to correct it,134 and the ability to object to its transfer to other organizations.135 Moreover, the enacted Directive would require that
data collectors maintain the security and confidentiality of the personal data and only
process personal data for specified, explicit, and legitimate purposes.136 More
particularly, the common use of Internet cookies in the U.S. may be especially
troublesome to the European Union Privacy Commission, which has previously stated
that "transfers involving the collection of data in a particularly covert or clandestine
manner (e.g. Internet cookies) pose particular risks to privacy."137 In theory, conflicts
such as these could keep the United States off the Privacy Commission's White List of
countries that provide adequate privacy protection.
Recognizing these dangers, various U.S. parties have begun to reconsider their
privacy policies. The Commerce Department proposed guidelines138 to help U.S.
companies meet the European Union's new privacy rules. The proposed guidelines
include: notifications to individuals about the collection and use of personal information;
132 See Directive, supra note 128, at art. 10.
133 See id. at art. 12.
134 See id. at art. 10.
135 See id. at art. 14.
136 See id. at art. 6.
137 The European Commission, First Orientations on Transfers of Personal Data to Third Countries: Possible Ways Forward in Assessing Adequacy (last modified June 26, 1997) <http://europa.eu.int/comm/dg15/en/media/dataprot/wpdocs/wp4en.htm>.
138 See C-Net News.com, U.S. Proposes Privacy Principles (Nov. 9, 1998) <http://www.news.com/News/Item/0,4,28510,00.html?owv>.
instructions about how to limit or stop the use of that information; notice when information is being sold, rented, or shared with a third party; disclosure about how secure that information is from being stolen or misused; and instructions to individuals about accessing any and all information a company has obtained about them.139 As of late November, perhaps in recognition of the possible embargo on information, the Direct Marketing Association had agreed to conditionally endorse a series of guidelines
proposed by the Commerce Department.140 What is encouraging is that these changes
have come about despite the fact that the European Union had agreed to allow
information to flow unimpeded between U.S. marketers and its member nations.141 How
long this state of affairs will last, however, is unknown.142
MARKETS AND SOCIAL NORMS
When Secretary of Commerce William Daley admonished businesses earlier this
year that protecting privacy was in their best interest, he was talking about the threat of
government regulation, not market forces.143 Businesses interested in collecting and
139
However, the administration and the EU still disagree on enforcement. The administration proposal
suggests the creation of independent' entities to resolve disputes between individuals and companies. It also
includes unspecified "consequences'' for firms not adhering to the guidelines. See id.
140
See Directnewsline, DMA Tentatively Backs Commerce Dept. EU Privacy Guidelines (last modified
Nov. 23, 1998)
<http://www.mediacentral.com/Magazines/DirectNewsline/Archive/1998112303.htm/Default>.
141
See Directnewsline, EU Agrees Not to Interrupt Data Flow for Time Being (last modified Oct. 28, 1998)
<http://www.mediacentral.com/Magazines/DirectNewsline/Archive/1998102801.htm/Default>. Despite
this concession, however, the EU has chosen to reject the proposed Commerce Department principles as
sufficient in the long run. See Fox News, EU Rejects U.S. Data Privacy Protection Plan (last modified
Nov. 23, 1998) <http://www.foxmarketwire.com/wires/1123/f_rt_1123_18.sml>.
142
For news as it becomes available in regard to the effects of the European Data Directive, see The
Privacy Page <http://www.privacy.org/> (last visited Dec. 7, 1998).
143
William Daley, Speech at Commerce Department Conference on Internet Privacy, June 23, 1998 (last
sharing personal information about their site visitors have never had it so good. The
collection, storage, analysis and sharing of information is cheap in today’s economy and
consumers willingly part with personal data in a market where divulging that data
without privacy protection is the norm.
As discussed above, all currently available web server software logs IP addresses
and various other pieces of information about every HTTP request it processes. In order
to collect additional information from web users, all a site operator needs to do is code an
HTML fill-out form and a CGI script to process its results. Many form examples and
tutorials are available on the web144 but form processing CGI programming used to be an
impediment to some users.145 Since free services now e-mail responses directly to the
form author, even that barrier has been eliminated.146 That means that setting up a
website that collects personal information is now no more expensive than setting up a
website that doesn’t.147 "Cookie" processing, as described above, is slightly more
144
See Web Communications, HTML Form Tutorial (last visited Dec. 3, 1998)
<http://www.webcom.com/~webcom/html/tutor/forms/>.
145
See National Center for Supercomputing Applications (NCSA), A Beginner’s Guide to HTML
<http://www.ncsa.uiuc.edu/General/Internet/WWW/HTMLPrimer.html> (last visited Dec. 3, 1998) (stating
that forms are not covered in this widely known primer because of the "need for specialized scripts to
handle the incoming form information").
146
See Response-O-Matic: Free Form Processor (last visited Dec. 3, 1998) <http://www.response-o-matic.com/>.
147
Website set up and maintenance costs have also fallen sharply. Many website hosting companies charge
less than twenty dollars a month for unlimited access sites while other websites will host individual pages
for free. See HostSearch (last visited Dec. 3, 1998) <http://www.hostsearch.com/> (listing many web
hosting packages under twenty dollars). See also Geocities (last visited Dec. 3, 1998)
<http://www.geocities.com/> (offering up to eleven megabytes of free web page space).
complicated but again free scripts are available on the web148 and the subject is covered
in many introductory programming books.149
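To make concrete how little code such collection requires, the following is a minimal sketch, written here in Python, of a CGI script that records a submitted form and assigns a tracking cookie. The field names and log file are hypothetical, and error handling is omitted for brevity.

    #!/usr/bin/env python3
    # Minimal sketch of a data-collecting CGI script (hypothetical
    # field names and log file; error handling omitted).
    import os
    import sys
    from urllib.parse import parse_qs
    from http import cookies

    # Parse the form data submitted in the request body (POST).
    length = int(os.environ.get("CONTENT_LENGTH", 0))
    form = parse_qs(sys.stdin.read(length))
    name = form.get("name", [""])[0]
    email = form.get("email", [""])[0]

    # Append the visitor's answers, plus the IP address the server
    # already knows, to a flat file; no database is required.
    with open("visitors.log", "a") as log:
        log.write("%s\t%s\t%s\n" % (os.environ.get("REMOTE_ADDR", "?"), name, email))

    # Assign a persistent identifier so the visitor can be recognized
    # on later visits.
    cookie = cookies.SimpleCookie()
    cookie["user_id"] = os.urandom(8).hex()
    cookie["user_id"]["max-age"] = 60 * 60 * 24 * 365  # one year

    print("Content-Type: text/html")
    print(cookie.output())
    print()
    print("<html><body>Thank you!</body></html>")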
Storage of the collected information is also very cheap and getting cheaper.
Storing the name, social security number, address, phone number, income, birthday, and
last 100 pages requested by a user would take approximately 5,000 bytes without any
compression.150 Using data compression algorithms, that number could be as low as 500
bytes.151 Current prices for hard drives mean that the data for approximately 8000 users
can be saved for every dollar spent.152 In short, the collection and sale of data is easy,
cheap, and extremely profitable.
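The arithmetic behind these figures is easy to verify; the short sketch below simply recomputes the numbers quoted in the text and the accompanying notes.

    # Back-of-the-envelope storage cost, using the figures in the text.
    bytes_per_user = 5000          # uncompressed profile (note 150)
    cost_per_gigabyte = 25.0       # dollars (note 152)

    bytes_per_dollar = 1e9 / cost_per_gigabyte    # 40,000,000 bytes
    users_per_dollar = bytes_per_dollar / bytes_per_user
    print(users_per_dollar)        # 8000.0 uncompressed; with the
                                   # ten-fold compression of note 151,
                                   # roughly 80,000 users per dollar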
Under the current legal regime – or lack thereof – and the current state of the
technology, information essentially "belongs" to whatever entity has access to it. The
user has little control over what information is revealed or how it is used, and very often
has only the choice between handing over information or not using the Internet at all. In
this way, the United States' lack of a legal framework has awarded to website owners an
"entitlement" to all information that they can obtain – leaving it to users to "buy back"
that information, if they can.
148
See e.g., Matt Wright, HTTP Cookie Library README (last visited Dec. 3, 1998)
<http://www.worldwidemart.com/scripts/readme/cookielib.shtml>.
149
See e.g., SHISHIR GUNDAVARAM, CGI PROGRAMMING ON THE WORLD WIDE WEB 202 (1996).
150
This number was arrived at using a group member’s personal data. Granted, the number used for salary
saved some data space; however, it was made up for by the excessively long name. The page references
used were the first ten pages referenced in this section of text.
151
PKWare’s PKZIP program was used for this example.
152
Based on storing an uncompressed 5,000 bytes per user and a hard drive cost of $25 per gigabyte, which
is the current average price for large drives at Compu Tech USA, a computer retailer. See Compu Tech
USA (last visited Dec. 3, 1998) <http://www.computechusa.com/>.
In a perfect market, users would, theoretically, be able to "repurchase" their
information so long as the users valued the information more than the website did.153 In
the absence of transaction costs, users would be able to negotiate with each website to
persuade that site not to use the data it had collected. Unfortunately, the transaction costs
in this instance are huge: users cannot easily coordinate their efforts so as to exploit their
market power, websites are not actual "people" and so cannot be negotiated with (leaving
the users in a flat take-it-or-leave-it position), the costs of investigating the policies of
individual websites are high, and even if a user is able to "repurchase" information from a
single website, there will still be a myriad of other websites that the user will have to
negotiate with in order to truly "own" his or her information – otherwise, the individual
contract formed with the single website is valueless, because other websites will still be
able to trade the information freely.154
Industry self-regulation has been remarkably unsuccessful in large part because of
these basic realities.155 Though the Federal Trade Commission has long advocated a
system of self-regulation and has provided guidelines to industry to assist it in this
effort,156 the FTC reports that fewer than 14% of websites surveyed even posted basic
information about their privacy practices, and fewer than 2% actually had developed
153
See A. MITCHELL POLINSKY, AN INTRODUCTION TO LAW AND ECONOMICS 11-12 (1989).
154
See Kang, supra note 95, at 1256.
155
See FTC, supra note 6.
156
See FTC, Consumer Privacy on the World Wide Web (last visited Dec. 8, 1998)
<http://www.ftc.gov/os/1998/9807/privac98.htm>.
"comprehensive" privacy policies.157 The Electronic Frontier Foundation has also noted
that despite guidelines issued by the Direct Marketing Association regarding web user
privacy, most websites are stubbornly noncompliant.158 Given the profitability of
collecting personal information, as well as the costs of implementing a system of privacy
protections,159 these results are hardly surprising. In fact, whenever a particular company
does make basic assurances about protecting users' privacy, it is merely opening itself up
to liability should these policies be violated.160 A perfect example can be found in the
Geocities case: Geocities, a large provider of free web space, promised its customers that
their data would not be sold to any other entity. As it turned out, the data was sold, and
the Federal Trade Commission forced Geocities to modify its policies in exchange for dropping
its charges. However, rather than simply agreeing to abide by its original privacy policy,
Geocities chose instead to discontinue its privacy guarantees.161
Some attempts have been made to solve these problems and allow for greater user
control over personal information. P3P, described above, is one such proposal.
Unfortunately, P3P on its own suffers from the fact that the current "free-market" regime
is very beneficial to corporate interests. Commercial websites own the information now
– this leaves consumers with precious little to bargain with, and it leaves websites with no
157
See id.
158
See EPIC, Surfer Beware II: Notice is Not Enough (last visited Dec. 8, 1998)
<http://www.epic.org/Reports/surfer-beware2.html>.
159
See Joseph M. Reagle, Jr., Boxed In: Why U.S. Privacy Self-Regulation Has Not Worked, Dec. 4, 1998.
160
See id.
161
See Joel Brinkley, Website Agrees to Safeguards in First On-Line Privacy Deal, N.Y. Times, August 14,
1998, at A15.
incentive to adopt P3P technology. Though, in theory, widespread consumer demand for
P3P might entice websites to begin to use it, such coordinated demand is unlikely to arise
given general consumer ignorance, inertia, and the entrenched commercial interests that
have a stake in ensuring the system remains unchanged.
Case Study: A Norms Solution
In early 1996, Internet privacy concerns became so great and widespread that the
U.S. government began to consider possible legislation regulating privacy practices on
the Internet. Businesses on the Internet clearly did not think that this was in their best
interest, fearing that governmental intervention would not be researched properly, would
be too heavy-handed, and would be too strict for businesses' desires. Instead, self-regulatory
efforts arose to head off the need for
governmental intervention.162 After hearing a talk at Esther Dyson's PC Forum in March
1996, Lori Fena, Executive Director of the Electronic Frontier Foundation (EFF), and
Charles Jennings, founder and CEO of Portland Software, formed TRUSTe.163
TRUSTe is an effort to change the norms with regard to privacy on the Internet.
The idea is to create a "trustmark" that websites can display and that acts as a
certification to the Internet public. TRUSTe is attempting to create a social norm that
162
See TRUSTe, For Web Publishers (last visited Dec. 9, 1998)
<http://www.truste.org/webpublishers/pub_privacy.html> ("The federal government is dead serious about
enacting stringent privacy legislation if the industry doesn't show NOW that it can self-regulate effectively
.... If you don't take advantage of self-regulatory measures such as the TRUSTe program for protecting
users' privacy, it is only a matter of time before government regulation takes over. Federal intervention will
absolutely translate into less flexibility and more costly privacy implementation measures for Websites
across the board.").
163
See TRUSTe, About TRUSTe (last visited Dec. 6, 1998)
<http://www.truste.org/about/about_truste.html>.
dictates that if a user visits a site bearing the TRUSTe trustmark, he should feel confident in the
stated privacy practices of that website. This desire to shift the norms to have people feel
secure while using the Internet, specifically while accessing a website that is a member of
the TRUSTe organization, is even present in TRUSTe’s trademarked motto: "Building a
Web you can believe in."164 The two key principles that TRUSTe is founded on are that
"users have a right to informed consent" and "no single privacy principle is adequate for
all situations."165
This shift of norms must be accomplished on two fronts: websites must join the
TRUSTe organization and abide by their rules, and users must understand what TRUSTe
is, what the trustmark means, and desire to access websites with the TRUSTe mark. To
join TRUSTe, websites must put together an easily-accessible public description of their
privacy practices that is linked to a clearly-displayed TRUSTe mark, and they must agree
always to abide by those privacy practices. TRUSTe members then pay a fee
proportional to their company’s annual revenue to the TRUSTe organization and thus
become official TRUSTe sites.166 TRUSTe sites are periodically audited by TRUSTe to
make sure they are in accordance with the TRUSTe rules and regulations, for it is in
TRUSTe’s best interest to promote faith in its mark by ensuring that the websites bearing
the mark be truthful and honest.
A TRUSTe site’s posted privacy statement must at a minimum describe seven
164
See id.
165
See id.
166
See TRUSTe, How to Join the TRUSTe Program (last visited Dec. 5, 1998)
<http://www.truste.org/webpublishers/pub_join.html> for details on how websites join TRUSTe and how
much money they must pay to join.
different areas of that website’s privacy policy.167 The first section explains what
information can or will be gathered about the user (e.g., e-mail address, site pages
visited), and the second section clearly identifies the entity that is making the collection.
The third section explains how any collected information will be used (e.g., for
anonymous, aggregate statistics, for personalization, for ad targeting). The fourth section
explains with whom the information might be shared (e.g., only divisions of the company
running the site, particular outside companies, the highest bidder). The fifth section
describes the "opt-out" policy of the site; this policy deals with those services and
collection schemes in which the user may not to participate. The sixth section explains
how a user may update or correct any information collected about him. The seventh
section explains the "delete/delist" policy. This policy deals with how a user may remove
his information or registration from a site.168
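As a rough illustration only, the seven required disclosure areas can be treated as a fixed checklist; the sketch below represents a hypothetical site's posted statement, with field names of our own invention rather than TRUSTe's.

    # Hypothetical representation of the seven disclosure areas a
    # TRUSTe privacy statement must cover (field names illustrative).
    privacy_statement = {
        "data_collected":    ["e-mail address", "pages visited"],
        "collecting_entity": "Example Corp.",
        "uses":              ["aggregate statistics", "ad targeting"],
        "shared_with":       ["divisions of Example Corp."],
        "opt_out_policy":    "Users may decline any optional collection.",
        "correction_policy": "Users may write in to update their data.",
        "delete_policy":     "Registration is removed on request.",
    }
    assert len(privacy_statement) == 7   # one entry per required area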
For TRUSTe to be successful, it must make the Internet public aware of, and
watchful for, the TRUSTe mark. Through public dissemination of information, such as
that easily accessible at the TRUSTe website,169 Internet users must be made to have faith
that websites will be true to their declared privacy practices. TRUSTe is continuously
striving to make people aware of what the trustmark means in an effort to have users
167
For good and interesting examples of this posted privacy policy, see C-NET (last visited Dec. 9, 1998)
<http://www.cnet.com/privacy.text.html>; Disney (last visited Dec. 9, 1998)
<http://www.disney.com/Legal/privacy_policy.html>; Prizes.com (last visited Dec. 9, 1998)
<http://www.prizes.com/privacy.phtml>; Wired (last visited Dec. 9, 1998)
<http://www.wired.com/home/digital/privacy/>; and Verisign (last visited Dec. 9, 1998)
<http://www.verisign.com/truste/>.
168
See TRUSTe, TRUSTe Program Principles (last visited Dec. 6, 1998)
<http://www.truste.org/webpublishers/pub_principles.html>.
169
See TRUSTe, The TRUSTe Program: How it Protects Your Privacy (last visited Dec. 7, 1998)
<http://www.truste.org/users/users_how.html>.
desire to use websites with the mark.170 The aforementioned minimum privacy practices
are advertised to the Internet public, as well as the fact that if any TRUSTe member
website is in breach of their declared privacy policies, the user should contact TRUSTe to
seek action against the website’s publishing organization.171
TRUSTe Strengths
A few key strengths exist in TRUSTe’s proposed plan of self-regulation and
manipulation of World Wide Web norms. First, adoption of TRUSTe is
technology independent and does not preclude other technological solutions.
Anonymization, encryption, and similar technological solutions can coexist with
TRUSTe. In fact, P3P provides the possibility for "assuring parties" to mark the privacy
practices of a particular website to boost users' faith in the value of their contracts. 172 A
second advantage is that, unlike P3P, no new technology needs to be adopted by
websites and users. Third, as TRUSTe wishes to maintain the credibility of its mark,
TRUSTe will engage in extra effort to audit the TRUSTe websites and ensure that they
abide by the rules and guidelines. Finally, TRUSTe claims that it will negotiate the
settlement of any breaches of stated policy between a website and user, providing the
170
See id. ("The trustmark is awarded only to sites that adhere to established privacy principles and agree to
comply with ongoing TRUSTe oversight and consumer resolution procedures.").
171
See TRUSTe, The TRUSTe Watchdog (last visited Dec. 7, 1998)
<http://www.truste.org/users/users_watchdog.html>.
172
See W3C, The Platform for Privacy Preferences (last visited Dec. 7, 1998)
<http://www.w3.org/TR/1998/NOTE-P3P-CACM-19980731/> ("The P3P proposal may also contain other
information including the name of an assuring party and disclosures related to data access and retention ....
The assuring party is the organization that attests that the service will abide by its proposal. Assurance may
come from the service provider or an independent assuring party that may take legal or other punitive
action against the service provider if it violates an agreement. TRUSTe (discussed elsewhere in this issue)
is one such organization that has provided Website privacy assurances, even before the development of
P3P.")
user a path of recourse if a website does not adhere to its promises. A user might
otherwise have no easy remedy against a website that had broken its privacy promises.
Problems
However, TRUSTe has many problems, current and potential, that are impeding
its effect on the privacy norms of the Internet. For one, it is designed primarily as a
solution for businesses to self-regulate and thereby ward off governmental
intervention,173 and as such, many of the fundamental principles of TRUSTe do not apply
well to small web publishers. TRUSTe’s proposed solution simply does not make it
simple or worthwhile for a private individual or small web publisher to register as a TRUSTe
website. In fact, TRUSTe may even raise possible anti-trust issues, as TRUSTe's privacy
requirements may act as a barrier to entry in the e-commerce market for smaller
companies.174 An additional detriment to smaller organizations trying to adopt TRUSTe
is that it increases the liability of the company. A company would not necessarily wish to
subject itself to possibly having to pay fines via TRUSTe for a breach of privacy for
actions that, but for its adoption of the TRUSTe mark, are perfectly legal and without
penalty.175
Another serious problem is that the TRUSTe mark does not mean anything other
than that the website adheres to its stated policies – it is not an endorsement of the
173
See TRUSTe, supra note 162 ("The electronic commerce industry stands to miss out on billions of
dollars in revenue if Websites don't address consumer privacy.").
174
See Reagle, supra note 159.
175
See id.
policies themselves, and the mark itself does not alert the user as to what those policies
might be.176 There is the potential for confusion among users, who might easily
understand the mark to signal the presence of a certain level of privacy protection, and
who might therefore willingly hand over information to a site with practices abhorrent to
the user. When TRUSTe was first launched, it issued three different types of marks
signaling associations to standardized TRUSTe privacy policies. However, many
websites did not like the codifying scheme, and it was switched to a single mark.177
For TRUSTe to be effective, the user must examine the privacy policy of every
website visited and determine, case-by-case, whether each website's practices are
tolerable. This process is cumbersome and complicated, and most users will not want to
be bothered to do it (though such manual checking and rechecking would be greatly
reduced through the use of a technology such as P3P). Also, TRUSTe does not allow for
any negotiation – users must either accept a site's policies or leave the site. Again, the
addition of a technological solution such as P3P would provide some help in this regard.
176
For instance, Classifieds2000 bears the TRUSTe mark, and describes its privacy policy as:
Occasionally we ask you to provide us information such as your name, e-mail address, or
what your interests and hobbies are. You never have to answer these questions although
this may limit what services we can offer you .... We use a feature of your browser called
a "cookie" to assign a "User ID." Cookies, by themselves, cannot be used to find out the
identity of any user. Your Classifieds2000 User ID automatically identifies your
computer - but not you - to our servers when you visit our site. Unless you specifically
tell us, Classifieds2000 will never know who you are, even though we assign your
computer a cookie. Other companies which place advertising on our site also have the
ability to assign a different cookie to you in a process that Classifieds2000 does not
control. But since cookies aren't useable for identifying individuals (you'll be just a
number to them), these advertisers will never know who you are - and Classifieds2000
will never tell them!
Classifieds2000 (last visited Dec. 9, 1998) <http://www.classifieds2000.com/cgi-cls/display.exe?c2k+na_PrivacyPolicy#gathered>.
177
Telephone Interview by Christian Baekkelund with Joseph M. Reagle, Jr., Resident Fellow, Berkman
Center for Internet & Society, Harvard Law School (Dec. 3, 1998).
Though TRUSTe is still largely unknown to Internet users, it recently signed on
some very visible sites. Many of the large commercial Internet sites have joined
TRUSTe, including America Online, IBM, Yahoo!, CNET, and Infoseek. In October
1998, TRUSTe introduced a special children’s trustmark, and further dealings with
the Federal Trade Commission helped TRUSTe reaffirm its significance.178 These steps,
however, are relatively small and simply not enough. Of the many millions of Internet
sites that collect information about their users, only 215 are currently TRUSTe
certified.179
It is interesting to note that a very similar approach to TRUSTe’s is about to be
launched in early 1999 by the Better Business Bureau (BBB). The BBB is attempting to
accomplish the same goals as TRUSTe with its own mark, the BBBOnline Privacy
Seal.180 The BBBOnline program has almost all of the requirements and stipulations of
TRUSTe, plus a few additional baseline privacy requirements for joining, giving the user
somewhat more reassurance upon seeing the Privacy Seal.181 BBBOnline, however,
will probably be subject to many of the faults of TRUSTe’s attempt at shifting public
norms regarding privacy on the Internet.
The future of TRUSTe is bleak. Plagued with numerous problems that inhibit its
178
See TRUSTe, supra note 163.
179
See TRUSTe, Look Up a Company, (last visited December 9, 1998)
<http://www.truste.org/users/users_lookup.html> (listing all TRUSTe certified web sites).
180
See BBBOnline, Online Privacy Self-Regulation Program (last visited Dec. 8, 1998)
<http://www.bbbonline.org/privacy/fr_bd2_1.html>.
181
See BBBOnline, The BBBOnline Privacy Program (last visited Dec. 8, 1998)
<http://www.bbbonline.org/privacy/index.html>.
mass adoption, it is not likely that TRUSTe will achieve the penetration necessary to
shift social norms and affect personal privacy on the web. Ultimately, both TRUSTe and
P3P suffer from the same problem – the lack of incentive by website owners to adopt
strict privacy policies. Both are measures to improve contracting – P3P lowers
transaction costs while TRUSTe distributes information – but, as was stated above, web
users currently have no assets to contract with. Any information they can offer to a site is
generally available to that site without a contract at all, and the web's exponential growth
demonstrates that, despite fears about lack of privacy protections, users are willing to risk
their personal information for Internet access. Until this state of affairs changes,
there is no reason to expect that any proposal that depends on industry initiative will be
widely accepted.
PROPOSAL
As discussed in the previous section, the current privacy protection framework
does not protect individual privacy. In this section we outline a radically different privacy
framework where individuals own their personal information and data collectors must
contract with users for access to it. We impose sanctions on most uses of an individual's
personal information not explicitly authorized by the individual and we suggest a
technical solution to the high transaction costs of getting authorization.
ENTITLEMENT TO INDIVIDUALS
We start with the assertion that the current set of entitlements is wrong, from a
distributional and efficiency standpoint.182 From a distributional and moral standpoint, we
do not believe that there is any justification for allowing personal information to "belong"
to a commercial entity merely because that entity has access to it – such a rule, in fact, is
analogous to a real-space rule that allows anyone to break into an apartment and steal the
television set so long as the homeowner left the door unlocked. Though, given the ease
of locking one's door, we might have little sympathy for the homeowner careless enough
to fail to take precautions, this does not mean that we have handed over the property
rights in the homeowner's possessions to all passers-by. The case for refusing to hand
over property rights to information gatherers in cyberspace is even stronger, as, given
today's technology, it is virtually impossible for a user to "lock" his or her doors against
182
See Guido Calabresi and A. Douglas Melamed, Property Rules, Liability Rules, and Inalienability: One
View of the Cathedral, 85 HARV. L. REV. 1089 (1972) (explaining the use of distributional and efficiency
considerations for allocating entitlements).
invasion in the first place. There are also unpleasant distributional effects to awarding
commercial websites, rather than individuals, the right to gather information – users, who
may be composed of individuals of widely varying economic resources, may, as a group,
be less able to afford to make the purchase than would be commercial entities which are,
by and large, a better organized and wealthier group.
From an efficiency standpoint, the current set of entitlements is unsatisfactory, for
all of the reasons set forth in the previous section. If users are too disparate and
widespread a group to effectively coordinate with each other, they cannot easily buy back
their information even if they desire to do so and even if such an ultimate allocation of
resources would be efficient. Industry, with its superior resources, organization, and
information, can most easily initiate requests, determine appropriate "price levels," and
generally prevent the harm that comes from unconstrained use of personal information. It
is therefore better to grant the entitlement to the user and thus force industry to "reveal"
its superior information through the process of contracting.
We propose, therefore, a new legal framework awarding the entitlement of
information to the individual, rather than the current system which awards the entitlement
to the party which can build the better locks.183 This framework would set the default
rule at privacy for users, with three basic exceptions.
183
Cf. Kang, supra note 90, at 1245 ("[R]elying upon technologies alone may have unfavorable
distributional consequences, which favor the computer savvy and well-educated.").
THREE EXCEPTIONS
Operational and Aggregated Data
The first exception would be to allow websites to collect that information that is
functionally necessary for the communication to occur (in the form of TUIDs184) – for
instance, the user's IP address for the time it takes to transmit data. The second exception
would allow the collection of information held in the aggregate (which would be
recorded in the history of TUIDs and also transferred in the form of PUIDs185 if the user
so desires). Because websites often want information about the demographic
characteristics of their own users, and because such information represents no privacy
threat to individuals so long as all data that would connect the information to a particular
person is erased, websites will be permitted to gather this information without users'
permission.186 However, websites would bear the burden of proving that data used for
aggregation was not individually identifiable.
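A minimal sketch of what this exception contemplates appears below; the field names are hypothetical, and the point is only that aggregate counts survive while every field that could identify an individual is erased.

    # Sketch of the aggregation exception: demographic counts are kept,
    # while identifying fields are discarded (field names illustrative).
    from collections import Counter

    def aggregate(visits):
        """Reduce raw visit records to anonymous, aggregate statistics."""
        histogram = Counter()
        for record in visits:
            histogram[(record["age_bracket"], record["region"])] += 1
            record.clear()    # erase the individually identifiable data
        return histogram

    stats = aggregate([
        {"ip": "18.0.0.1", "age_bracket": "18-24", "region": "FL"},
        {"ip": "18.0.0.2", "age_bracket": "18-24", "region": "FL"},
    ])
    print(stats)    # Counter({('18-24', 'FL'): 2}) -- no IPs retained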
Opt-In by Contract
The final exception would be to allow parties to "opt in" to a less-restrictive
system through the use of contract. Our new standard would require that to gather
information beyond the exceptions mentioned above, websites first gain the explicit
184
See the P3P Case Study, in Technology Section, for more information on TUIDs.
185
See the P3P Case Study, in Technology Section, for more information on PUIDs.
186
We would recommend that, as a penalty for misuse of data, websites be subject to a statutorily determined minimum penalty, either in the form of a fine or in the form of liability in tort. We also
recommend the creation of an agency to investigate complaints of privacy violations, as a user might only
know that his or her information was compromised, without being aware of which of the many companies
the data was released to was responsible for the violation.
permission of an individual. Further, if the website wants to do more than simply use the
data internally – that is, if the website wishes to sell the data to other businesses –
permission would also need to be obtained.187
AUTONOMOUS COMPUTER CONTRACTING PRIVACY TECHNOLOGY (ACCPT)
The granting of such permission on a case-by-case basis, though legally sufficient, would, of
course, be extremely unwieldy. Therefore, given this new legal regime forbidding
websites to collect information without permission, a technological and legal solution
must now be created to allow websites to easily contract into an agreement with a user as
to the privacy policy to be used. We call our proposed guidelines for these agreements
the Autonomous Computer Contracting Privacy Technology (ACCPT). The Platform for
Privacy Preferences (P3P) proposal being developed at the W3C provides an excellent
basis for what ACCPT is to try to accomplish and is adopted as the sample technological
foundation for ACCPT; the technological structure of ACCPT is simply to be a P3P
framework extended and revised with an eye to the legal concerns inherent in agency
contracting in order to ensure fair and enforceable contracts. It is worth noting at the
outset that for ACCPT to be adopted and work successfully, it must be both very simple
for the none-too-savvy user and highly configurable for the
power user. This range of ease of use and flexibility will hopefully come about in the
research and development of user agents by businesses, independent organizations, and
187
A third exception might be necessary to allow websites to ascertain the real-space location of the user
employing modified TUIDs, with this record of location ultimately being erased unless the user later
explicitly agrees to its storage. See infra text accompanying notes 193-196.
private individuals, as is the intended goal with P3P.188
One of the major difficulties, from a legal standpoint, of allowing computerized
agents to contract away real users' privacy is the possibility of a disjunction between a
user's preferences and the "deals" made by the agent. Where there is no meaningful
assent on the part of the user, there is no contract.189 With no contract, under our
proposal, the website is potentially in violation of the law for having used the information
without permission. In order, then, to make sure that ACCPT transactions are legally
binding – and that they provide protection from liability for websites – we propose that
all ACCPT programs operate with a certain degree of transparency. Novice users may
not wish to delve into the bowels of the program in order to set their particular
preferences, and default settings may be devised for them; however, the ACCPT program
must require that the user affirmatively assent to basic aspects of these default profiles,
using clear and easily-understood explanations of what the settings actually mean.190 It is
only by this sort of transparency that a user can be said to have truly assented to the
contract formed by the agent.
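Such a requirement might resemble the following sketch of an agent's first run; the settings, wording, and behavior are hypothetical, not part of any existing specification.

    # Sketch of the transparency requirement: a (hypothetical) ACCPT
    # agent refuses to contract until the user affirmatively accepts a
    # plain-language description of each default setting.
    DEFAULTS = {
        "share_email": (False, "Your e-mail address will never be given out."),
        "aggregate_stats": (True, "Sites may count your visit anonymously."),
    }

    def confirm_defaults():
        """Return True only if the user assents to every default."""
        for setting, (value, plain_english) in DEFAULTS.items():
            answer = input("%s Accept? [y/n] " % plain_english)
            if answer.strip().lower() != "y":
                return False
        return True

    if not confirm_defaults():
        raise SystemExit("No meaningful assent; the agent will not contract.")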
Websites negotiating with agents identifying themselves as ACCPT compliant
could, therefore, rely on those contracts as representing the will of the user, whereas they
would enter into contracts with non-ACCPT technology at their peril. If the ACCPT
agent employed by a particular user then turned out to be faulty in some way – say it was
188
Interview by Christian Baekkelund with Daniel Weitzner (Dec. 8, 1998).
189
See Restatement (Second) of Contracts § 21.
190
A less technical and more readable version of the harmonized vocabulary would be widely
distributed to promote user understanding and awareness of terms.
not transparent about its defaults, or did not properly "follow" user instructions – liability
for the resulting privacy violations would lie on the maker of the faulty ACCPT
technology, using a negligence standard. In this way, websites would have an incentive
only to bargain with agents that truly do express the preferences of the user, while
programmers would have an incentive to ensure that their products are accessible, and
comprehensible, to less-skilled consumers.
P3P contains a number of beneficial characteristics that it would be wise for
ACCPT to adopt.191 ACCPT adopts in its entirety P3P's flexible negotiation between user and
service for contracting as to privacy practices. As designed in P3P, this flexible
negotiation, with possibilities for multiple proposals sent by the service and counter-proposals sent by the user, allows a good and appropriate agreement to be readily and
easily found. P3P's adoption of already defined and used technological standards such as
HTTP, XML, and RDF is another benefit worth noting; it is important to use popular
standards to promote the adoption of ACCPT solutions by businesses. The use of a
harmonized vocabulary is an advantageous step towards universal acceptance and
understanding of P3P, and ACCPT proposes to use the same technique, and in fact retain
the original vocabulary.192 Additionally, the open and non-proprietary nature of the
development of P3P by the W3C is another important point to maintain in ACCPT, for
not only does open development engage well with the developmental nature of the
Internet, it helps foster better and more secure technologies as well as further enable and
191
See the P3P Case Study, in the Technology Section, for more information on the following listed
benefits of P3P.
192
See W3C, supra note 61.
encourage market adoption.
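To give a flavor of such an exchange, the sketch below shows a service proposal expressed in illustrative XML and a user agent that rejects it and returns a counter-proposal. The element names and retention terms are our own and are not the actual P3P vocabulary.

    # Sketch of a proposal / counter-proposal exchange (element names
    # are illustrative, not the real P3P vocabulary).
    import xml.etree.ElementTree as ET

    SERVICE_PROPOSAL = """<PROPOSAL entity="example.com">
      <STATEMENT purpose="ad-targeting" retention="indefinite">
        <DATA ref="user.email"/>
      </STATEMENT>
    </PROPOSAL>"""

    def acceptable(proposal_xml):
        """Reject any proposal containing an indefinite-retention term."""
        root = ET.fromstring(proposal_xml)
        return all(stmt.get("retention") != "indefinite"
                   for stmt in root.iter("STATEMENT"))

    if not acceptable(SERVICE_PROPOSAL):
        # Counter-propose by tightening the retention term; the service
        # may accept, re-counter, or break off the negotiation.
        counter = SERVICE_PROPOSAL.replace("indefinite", "1-year")
        print(counter)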
The actual technical differences between the adopted ACCPT base protocol, P3P,
and ACCPT itself are small, but important. First, as stated above, ACCPT requires a
certain degree of transparency in its operation to ensure that users do not object to the
activities of the agent. Second, ACCPT requires that the individual have some basic
digital certification verifying at least key characteristics such as age and relevant
geographical location.193 This is necessary because ACCPT envisions that sovereigns –
national governments, local governments, and even parents – may wish to restrict an
individual's ability to reveal certain pieces of information. In an effort to enable this
possibility, ACCPT will transmit in the TUID of an ACCPT negotiation the location of
the user making the negotiation as it is represented in their digital certificate.194 Upon the
beginning of a P3P negotiation, the server would collect the profiles from the necessary
sources according to the location of the user. For example, if a user in Pompano Beach,
Florida, makes a request from an ACCPT server, the server may be required to retrieve
the privacy requirements of the city of Pompano Beach, Broward county, Florida state,
and the United States before continuing with the negotiation with the user’s agent.
Which profiles the server must retrieve would be dependent upon the location of the user
193
It is important to note that how individuals attain these certificates and how the certificates are assuredly
true is beyond the scope of our proposal. Our proposal is simply based in a setting where this is the case.
194
The server would make use of a language to be proposed similar to P3P’s APPEL (A P3P Preference
Exchange Language). For more information on APPEL, see W3C, A P3P Preference Exchange Language
(last visited Dec. 8, 1998) <http://www.w3.org/TR/WD-P3P-preferences>. APPEL is to be used in P3P to
exchange privacy preference profiles between two entities, and a language similar to, but more secure than,
APPEL would be used in ACCPT to exchange privacy profiles in a similar manner. This APPEL-like
tool will also be helpful for users who desire to use the privacy practices of a certain organization, such as
their employer or a civil rights group, for they need merely set up their agent to pre-fetch these privacy
preferences.
and applicable laws.195 These profiles would then be used in conjunction with the
server’s privacy desires in the continuation of the ACCPT negotiation. The actual
ACCPT negotiation would then proceed in the same manner as it would under P3P.196
Ultimately, if the user failed to authorize the release of the geographical (or age-related)
data, the website would be required to erase it from its files.
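Using the Pompano Beach example, regional-profile retrieval might proceed as in the sketch below; every server name is hypothetical, the network fetch is a placeholder, and the cache reflects the pre-fetching suggested in note 196.

    # Sketch of regional-profile retrieval with caching (all server
    # names are hypothetical).  Given the location carried in a user's
    # digital certificate, the server assembles the privacy
    # requirements of each enclosing sovereign, most local first.
    _cache = {}

    def jurisdiction_chain(city, county, state, country):
        """Return hypothetical profile-server names, most local first."""
        return ["privacy.%s.example" % place
                for place in (city, county, state, country)]

    def fetch_profile(server):
        # Placeholder for a network fetch; results are cached so that
        # repeated negotiations do not regenerate the traffic.
        if server not in _cache:
            _cache[server] = "<PROFILE source='%s'/>" % server
        return _cache[server]

    profiles = [fetch_profile(s) for s in
                jurisdiction_chain("pompano-beach", "broward", "fl", "us")]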
ACCPT would also adopt many of the optional P3P-related features. To enable
the aggregate collection of information by websites even in the event of failed
negotiations, ACCPT will employ PUIDs and TUIDs as described in P3P, in
the manner mentioned earlier in the proposal. The creators of user agents will be strongly
encouraged to make use of propIDs and PUIDs as well as the necessary TUIDs; the
storage of a base set of information about the user on the user’s computer, as is
possible with P3P, will be encouraged as well.197 The use of propIDs will increase the
accountability of previously formed contracts as well as provide a network to the overall
structure, and the use of PUIDs combined with data repositories on users’ computers
will be very beneficial to businesses that desire to collect data on user preferences,
in a manner somewhat similar to, but as a replacement for, cookies.198 Also, in
195
We do not attempt, for the purposes of this paper, to determine which laws those would be – rather, we
just assume that some set of laws applies, and we propose that ACCPT be capable of enabling them. We
further do not make geography-based checking of the collective choices of a community a requirement for
the current proposal. Rather, we leave communities the ability to require such a mechanism.
196
Servers maintaining the privacy requirements for a locale must be created and maintained by the
appropriate organizations, and lists of such servers must be publicly and easily accessible for server
administrators to use to update their server configurations. In the creation and adoption of ACCPT, it
will be a necessity to help foster the creation and knowledge of these servers. With a properly designed
ACCPT system, a cache of such pre-fetches should be created and used frequently to minimize network
traffic.
197
See the P3P Case Study, in the Technology Section, for information on data repositories and vCards.
198
Interview by Christian Baekkelund with Daniel Weitzner (Dec. 8, 1998).
ACCPT, all propIDs would be digitally signed upon creation to "provide irrefutable
evidence of an agreement."199 Unlike P3P, there would be no fear on the part of the user
that the web agreement is unenforceable, for with ACCPT, the trust would lie in the fact
that the user knows the agreement has hard legal backing.
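What signing a propID at contract formation might involve is sketched below; a real deployment would use public-key digital signatures, but a keyed hash stands in here so that the example remains self-contained, and all names are hypothetical.

    # Sketch of signing a propID record at contract formation.  An HMAC
    # stands in for a true public-key signature.
    import hashlib
    import hmac
    import json
    import time

    SIGNING_KEY = b"agent-private-key"   # placeholder for a real key

    def sign_agreement(prop_id, terms):
        record = json.dumps({"propID": prop_id, "terms": terms,
                             "timestamp": time.time()}, sort_keys=True)
        signature = hmac.new(SIGNING_KEY, record.encode(), hashlib.sha256)
        return record, signature.hexdigest()

    # Both parties retain the (record, signature) pair as evidence of
    # the agreement; either can later verify it with the same key.
    record, sig = sign_agreement("prop-0001", {"share_email": False})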
199
See W3C, supra note 50.
CRITIQUE
Although the legal and technical framework that we propose as ACCPT has many
advantages and benefits, in this section we identify and discuss some of the potential
problems with the proposal.
TECHNICAL AND LEGAL COMPLEXITY ADDED BY ACCPT
Technical Complexity for the Programmer
First of all, although relatively simple, ACCPT does add an additional level of
complexity. On the technical side, for example, ACCPT adds considerable technical
complexity to the HTTP protocol. Unfortunately, the additional technical complexity
could lock out smaller programmers, which, if history is any indication, would probably
slow down the rate of development and innovation on the web. Right now,
the web is powered by software written by smaller programmers and academics and
released for free public use and modification. According to a December 1998 Netcraft
survey, over half of all web servers are powered by Apache, a free web server written by
an ad-hoc group of volunteers.200
Complicated protocols can lock out small programming teams from being able to
innovate. This might mean that large corporations will dominate the ACCPT compliant
agent software market. As the infamous Halloween Memo, an internal Microsoft
200
See Netcraft, The Netcraft Web Server Survey (visited Dec. 9, 1998) <http://www.netcraft.com/survey/>.
Microsoft's own Hotmail service runs on Apache, on top of the FreeBSD operating system (developed in
the same fashion), despite Microsoft's repeated attempts to move to NT/Microsoft-IIS. See id.
document, stated:
OSS [free software] projects have been able to gain a foothold in many
server applications because of the wide utility of highly commoditized,
simple protocols. By extending these protocols and developing new
protocols, we can deny OSS projects entry into the market.201
ACCPT is an extension of the HTTP protocol and one with complex legal issues
at that. Currently HTTP is trivial to implement. Even law student hackers can write
HTTP clients without much trouble, particularly with the wealth of free software libraries
available to help.202 Incorporating ACCPT will be more complicated but we expect that
similar libraries would be developed following the ACCPT guidelines and that the extra
complexity of the ACCPT protocol would eventually be masked from most programmers.
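To underline the point, the following is a complete, if bare-bones, HTTP/1.0 client written from scratch; it uses nothing beyond Python's standard socket library.

    # A bare-bones HTTP/1.0 client, illustrating how simple the
    # unextended protocol is to implement from scratch.
    import socket

    def http_get(host, path="/", port=80):
        with socket.create_connection((host, port)) as sock:
            request = "GET %s HTTP/1.0\r\nHost: %s\r\n\r\n" % (path, host)
            sock.sendall(request.encode("ascii"))
            response = b""
            while chunk := sock.recv(4096):
                response += chunk
        return response.decode("latin-1")

    print(http_get("example.com")[:200])   # status line and headers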
Technical Complexity for End-Users
In addition to the added intricacy on the programming end, ACCPT agents
will require installation, either as part of a browser or separately. Users may be reluctant
to set (or even read and accept) their ACCPT preferences.
In counterpoint, however, it would only be fair to recognize that the ACCPT
proposal only requires certain settings at installation in order for a website to be certain
that a computerized agent is authorized by the user. The installation of most computer
programs requires an initial time commitment to set up preferences and the like.
Moreover, the protocol could make the process of installing the browser as easy and
201
Open Source, {Halloween I -- 1.10}: Open Source Software (visited Dec. 9, 1998)
<http://www.opensource.org/halloween1.html>.
202
See e.g. LIBWWW-PERL-5 (last visited December 9, 1998) <http://www.cpan.org/modules/by-module/Bundle/libwww-perl-5.41.readme> (describing the World Wide Web functions PERL library
which implements many HTTP methods).
efficient as possible. Just as the additional technology required by ACCPT can add
complications, technology can also eliminate those complications, as we expect it will.
Legal Complexity for Programmers
ACCPT places legal liability for buggy software on small programmers. Although
the legal framework is not excessively complicated, it may still discourage small
programmers from entering the space for fear of suit. The use of a "negligence standard"
may mitigate that effect but application of that standard is unlikely to be easily
understood by programmers. The negligence standard is not a standard that is delineated
by law or statute, but one that is given its force by the courts, as cases and controversies
with concrete facts are decided. Thus, it is really only as problems come up that the
standard is fleshed out. Given the importance of the free software movement, it is
possible that such an ambiguous standard, with the accompanying penalties, will be
highly discouraging to programmers. It is unlikely that they are willing to be guinea pigs
for the legal system when they often receive little benefit for making their programs
available to the public gratis.
However, such an argument ignores the fact that the ACCPT legal framework
proposed is actually simpler than the legal framework that now exists. The current legal
framework is a patchwork of industry-specific statutes and difficult-to-follow common
law which varies by state.203 The main innovation of ACCPT is the "flipping" of the
entitlement to information. Rather than the automatic disclosure of information, with the
burden on the user of the site to negotiate with the site in order to regain his or her
203
See supra Law Section.
information, ACCPT would help the user to "own" his or her information, requiring the
site to negotiate with the user for access to the information. Rather than adding a layer of
legal complexity, ACCPT may be better analogized to an inversion of the current
framework. Moreover, the complexity of the "legalese" can be minimized by those who
write the law. Given the alternative mosaic of common law rights and industry specific
statutes, of which it is unlikely that programmers are any more aware, the more
comprehensive and general legal backdrop proposed by this paper is preferable.
Legal Complexity for End-Users
Aside from the legal complexity presented by ACCPT to programmers, there may
also be residual effects on end-users. Under ACCPT, users are allowed to customize
privacy preferences in extreme detail. Upon installation, the user may feel overwhelmed
by "legalese," and may be discouraged from participating in on-line transactions
altogether. In an analysis of P3P, this same problem was cited. Citibank expressed its
opinion that:
>From a consumer standpoint using P3P may be quite confusing, as the
user may feel inundated with "legalese" and too many choices. . . . The
user may feel inundated with too many choices about too many pieces of
information when configuring P3P. If the user does not choose to fill in
the information beforehand, there will be a point when the user is
transacting with a website operator, that he/she would need to fill in that
information. The user would then have to decide under what sort of
conditions he/she would like to have personal information given out. . . .
Nevertheless, there is a possibility that P3P will discourage neophytes
from transacting or interacting online altogether. 204
In reality, we believe that proper user interface implementation can avert this
204
LEE & SPEYER, supra note 65.
problem. Most browsers will probably lump privacy settings into large, easily
understandable groups, and only advanced users will engage in further customizations.
Furthermore, the facilitation of collective privacy choices through the APPEL-like
functionality of a well designed ACCPT agent will allow users to use settings constructed
by groups instead of laboriously constructing their own.
Public Access
Also important to note is the fact that the technical complexity added by ACCPT
may have unintended secondary effects on public access. For instance, many schools and
libraries have public access web terminals. It would be unwieldy for each user to
customize the ACCPT preferences on these machines. At the same time, without
ACCPT, users may not be able to view a potentially large part of the content on the
Internet. ACCPT may be sacrificing some amount of public access for the sake of
flexibility.
Services for the remote storage of ACCPT profiles should arise to solve this
problem in much the same way Yahoo currently stores many people’s calendars.205 This
would again make use of our proposed APPEL-like language for transferring privacy
preferences.
A Slower Internet
An additional problem with the use of ACCPT is the network traffic that it would
create. Due to the necessary negotiation that must occur before a user can access data
205
See e.g. Yahoo Calendar (last visited December 8, 1998) <http://calendar.yahoo.com/>.
from a website, more data must be transmitted back and forth before a user
can gain access to the site, creating both a noticeable delay in the time it takes to initially
receive information from a website and an overall increase in the amount of data
transmitted in general. This overall increase in data transmission will in fact slow
down other users on the Internet as well, especially at points of low bandwidth. This
problem would be further exacerbated by ACCPT's use of regional profiles to enforce
legal standards due to the even further elevated level of necessary network traffic. This
problem, however, may become moot in the future as the average Internet user's
bandwidth increases at the same rate it has in the past.
LIMITATIONS OF ACCPT
While the above criticisms constitute commentary about the problems with the
implementation of our proposal for ACCPT, legitimate objections may be made that
ACCPT does not go quite "far enough." Though ACCPT has addressed many of
the problems in government policy towards the protection of information, other areas of
concern also exist which we have not fully addressed.
International Conflict of Laws
A major limitation of our proposal is its domestic nature, when the problem with
information privacy is a global one. Most likely, not all nations will pass laws requiring
or even supporting ACCPT. Since the ACCPT proposal does not require its international
adoption – though we would welcome it – international websites could lie about the level
of privacy they provide without there being an easy recourse for consumers. Some
nations could provide safe haven for websites that wished to collect information without
accepting limitations on its use.206 As long as other countries can break ACCPT
contracts without repercussions, then the proposal may be adding very little in the way of
protection of privacy.207 The U.S. government can "forbid" users to contract away data
without particular guarantees if it so chooses, but if the other end of the party breaks the
deal, there is no recourse. Some argue that a completely technical solution may be the
only way to deal with this problem.
However, a "complete technical solution" cannot be the only answer to the
question. As with our proposal, the problem is that technology doesn't exist in a vacuum
– there must be legal standards in place to encourage the adoption of the technology.
This is particularly true on the international level, where other countries might actively
pass laws preventing its adoption. On the other side, this is also a question of sovereignty
– treaties, conflicts of law principles, and the like step into the gaps, just as in "real
world" conflicts.208 Granted, technology may provide a partial solution. If the United
States adopts a code-based solution, it is likely that Europe will follow. Since many packets pass
through routers in the U.S., such a solution would guarantee at least partial privacy for U.S.
citizens.
If TCP/IP were modified so that websites could not collect IP addresses, and
electronic payment and shipping were made completely anonymous, no
206
See Sovereignty Presentation for Cyberspace and Social Protocols, Harvard Law School (Nov. 24,
1998).
207
See id.
208
See generally, Jack Goldsmith, Against Cyberanarchy (visited Dec. 8, 1998) <http://www-swiss.ai.mit.edu/6805/articles/goldsmith-against-cyberanarchy.html>.
companies could collect personal information. If anonymizing routers were implemented
as part of IPv6, countries would have to adopt it to avoid being cut off from the rest of the
Internet. Regardless, ACCPT does not preclude the additional protection of a code-based
framework to protect privacy.
Profiling
Another criticism of ACCPT is that it does not address the collection of "public"
information into detailed profiles. Currently, much information is available for legal
collection. There is no current U.S. provision,209 nor is there any provision in ACCPT, to
prevent the selling of this information. An individual or organization is legally allowed
to collect information from the United States postal system, phone directories, tax
information, family lineages/genealogies, and real estate records, as well as to collect all
posts, web pages, or any other on-line "traces" made by individuals.
This problem is one that should definitely be considered by the legislature and
courts. Even if ACCPT becomes widely accepted, it is indisputable that such endeavors
would still technically be legal, yet they may be a more serious violation of privacy than a
website keeping a record of IP addresses. Surely ACCPT will slow the collection and use
of this data, as ownership would have to be established and the information collected
without any help from the individual being profiled. However, ACCPT neither reaches
information that is already public nor regulates the collection of information about users in the
real world.210
209
See supra Law.
210
For example, what if Alice discloses information, yet decides she wants it back 15 years later? Perhaps
Operational Data
Another problem with a legal structure that prevents collecting any information
without permission is that of data that needs to be collected for normal server operations.
Although the law includes provisions for the collection of operational data for use in
fulfilling transactions (IP addresses used to send requested pages), servers frequently
need or want to collect additional operational data for security purposes. Currently, the
operational-use exception under our proposal does not include security use. Indeed, what
would constitute a legitimate security use is unclear. Presumably, an officer with a
warrant could obtain the operational logs in spite of a user’s privacy interest in them.
However, though there was significant discussion over the tradeoff between privacy and
accountability, our proposal errs on the side of privacy and does not provide a specific
security use exception.
If introduced, anonymous browsing, shipping, and payment technologies may
make a potential security use exception much less important as much operational data
could be revealed with minimal privacy loss. Our proposal encourages further work in
these areas.
as a 21-year-old college student, she traded some information when ordering a rubber chicken to save 50
cents. Now she is 39, has discovered cold fusion, is under a lot of media attention, and doesn't want it to
become public that she has a chicken fetish. There is some flexibility here in an ACCPT negotiation where
a site might state as a privacy practice that it deletes all records after 2 years or some other similar period of
time. However, without law mandating such a requirement, it is hard for a user to require it. Although it is
easy for a website to state something, and a user to accept it or not, under ACCPT, it is more difficult for
the user to make a counter-proposal with certain stipulations. The website may not even recognize some of
the user's proposals/requests.
"Race to the Bottom"
Finally, a potential problem unaddressed by our proposal is that ACCPT may not result in a diversity of privacy policies if various websites collude. This may become especially problematic if websites only allow access to their pages upon the disclosure of information; if a "floor" develops, such that all pages require a minimum amount of information, then the privacy of users may not be significantly protected by ACCPT. This is an issue that we recommend the government closely monitor. In the event that a floor on information does develop, Congress may reanalyze the issue, perhaps distinguishing between information that can be exchanged and information that is inalienable.
PHILOSOPHICAL CRITICISMS OF ACCPT
In addition to the various criticisms thus far discussed, there are several debates that occur at a more basic level. ACCPT is based upon the acceptance of several assumptions and basic beliefs about privacy.211 Yet there are various other frameworks within which alternative proposals could be suggested.
211 See supra Introduction.
No Data Collection
A more extreme view is that there is no need for data collection at all. Despite initial impressions, a fair number of arguments can be made in its favor.
First, while data collection may bring in a significant amount of revenue, e-commerce
may actually increase with a prohibition against the collection of data.212 Second, the argument maintains that most respectable websites do not engage in privacy violations; thus, the only sites such a prohibition would potentially harm provide minimal positive value to society.213 Moreover, larger revenues do not necessarily lead to good content. Many websites with quality content are created by people interested in more than money, such as public interest organizations,214 musicians,215 academics,216 etc., more so than content from overcommercialized sites.
212 The FTC states that:
[Although] the online marketplace is growing rapidly, there are also indications that
consumers are wary of participating in it. Surveys have shown that increasing numbers of
consumers are concerned about how their personal information is used in the electronic
marketplace. . . . In fact, a substantial number of online consumers would rather forego
information or products available through the Web than provide a Web site personal
information without knowing what the site's information practices are. According to the
results of a March 1998 Business Week survey, consumers not currently using the
Internet ranked concerns about the privacy of their personal information and
communications as the top reason they have stayed off the Internet. . . . These findings
suggest that consumers will continue to distrust online companies and will remain wary
of engaging in electronic commerce until meaningful and effective consumer privacy
protections are implemented in the online marketplace. If such protections are not
implemented, the online marketplace will fail to reach its full potential.
FTC, supra note 6.
213 The Direct Marketing Association takes issue with this assertion. It states:
A Website that provides no opportunity for interaction is not interesting to consumers,
adds little or no value to their dialogue with marketers, and therefore is of little value to
businesses. Some may need to simply track the number of persons who are visiting their
sites. Others may encourage visitors to ask questions as a way of improving their services
or building their customer base. Others may provide a product for free that they charge
for in the traditional world. In return, it seems reasonable for a publisher to ask a Website
visitor to register before providing that person with the same material that could cost him
on a newsstand.
Direct Marketing Association, FTC: Consumer Privacy Comments Concerning The Direct Marketing Association--P954807 (July 16, 1997) <http://www.ftc.gov/bcp/privacy/wkshp97/comments2/dma027a.htm>.
214 For example, much of the research for this paper was conducted on the web. Many of the citations in this paper were pulled from public interest organizations such as the Electronic Frontier Foundation, which has also been ranked as one of the four most-linked-to sites by Webcrawler. See Electronic Frontier Foundation (last visited Dec. 9, 1998) <http://www.eff.org/>.
215 For instance, Lycos lists among its Top 5% Websites The Recorder Home Page. See Lycos, Best of Web (visited Dec. 9, 1998) <http://point.lycos.com/index.html>. Although the creator never says he is a recorder player himself, he provides a non-commercial site containing an incredibly detailed historical background, an extensive list of links, and a discussion of the cultural role of the recorder in all its forms, from medieval pipe to the contemporary plastic kid's noisemaker. See Nicholas S. Lander, The Recorder Home Page (last updated Nov. 30, 1998) <http://iinet.net.au/~nickl/recorder.html>.
216 Lycos also lists multiple university libraries that provide on-line access to their resources. See Lycos, supra note 215. Moreover, many academics will publish their books and articles on-line before "paper" publication. See, e.g., Simson Garfinkel, 2048: Privacy, Security and Community in the Next Century (visited Dec. 9, 1998) <http://simson.net/2048/>.
Rather than simply encouraging growth and more content on the Web, perhaps we should be encouraging better quality or discouraging the minimally beneficial websites. Finally, marketing and data collection use resources but contribute little of substance back to society. In the long run, if human resources are reallocated from data collection or marketing to productive work, the output and average standard of living for society as a whole may rise. Although this may hurt e-commerce in the short run, it may help other industries just as much in the long run.
An Internet in which data collection is prohibited is some people's view of a Utopian cyberspace. However, our proposal took a different stance because we believe that the technology is not mature enough for this to be a practical goal. Furthermore, we recognize that different people have different privacy concerns and would be best served by different privacy policies. Thus, in fulfilling our goals of protecting individual privacy while also enabling electronic commerce and encouraging a diversity of privacy policies, we took a more moderate approach.
Less Regulation
On the opposite side of the spectrum, there are advocates for the philosophy that there should be little to no regulation of data collection and storage.217
217 See OSCAR H. GANDY, JR., THE PANOPTIC SORT: A POLITICAL ECONOMY OF PERSONAL INFORMATION 66-68 (1993).
The argument assumes that as long as the information is obtained legally, i.e., no trespass or fraud is involved, the user has no more power over what is done with that information. The basic assumption underlying this theory is about entitlements – if a website has the technological ability to obtain information about the user, then the website owns that information.
For advocates of this argument, its force comes from the First Amendment. In support of the theory that collectors of legally obtained data can do whatever they like with that data, some have argued that the primary constitutional issue is free speech, rather than privacy; once information is freely released, any restrictions on corporate speech – such as the dissemination of customer lists – should be disfavored.218 However, the ACCPT proposal does not present an absolute ban on such speech. Instead, it suggests that the government implement constitutionally permissible "time, place, and manner restrictions"219 – that is, that the government simply regulate the manner in which this "speech" is conducted.
More importantly, this seems to be a misleading application of the First Amendment. Rather than the right to free speech, the issue is more closely analogous to the limited right to appropriate or use other people's speech – in this case, an individual's private information. Copyright, trademark, and patents are all examples of instances where the government has legitimately stepped in to protect the rights of the original
"speaker." Alternatively, ACCPT can be seen as an extension of the government's right
to regulate interstate commerce.220 Given that the most vocal proponents of this structure
of "less regulation" are from the direct marketing camp,221 it is unlikely that they would
deny that interstate commerce is involved.
Moreover, a world with this structure would degenerate into a cat-and-mouse game, with the user trying to stay technologically one step ahead of commercial websites. It is a game that most users of the web are likely to lose, given their lesser resources, their more limited access to technological innovations, and their problems with collective action.
218 See id. at 107.
219 Even if direct marketing were held to use public forums, which have the strictest standards for regulation, the ACCPT proposal passes strict scrutiny and would be held permissible. See Ward v. Rock Against Racism, 491 U.S. 781 (1989) (holding that if the regulation is content neutral, narrowly tailored to serve a significant government interest, and leaves open alternative channels of communication, government regulation is permissible).
220 See, e.g., U.S. v. Darby, 312 U.S. 100 (1941) (affirming the federal government's right to regulate interstate commerce).
221 See GANDY, supra note 217, at 66-68.
CONCLUSION
The introduction of this paper discussed Alfred, the hypothetical web surfer. In the sections that explained the current technical, legal, market, and social norm frameworks, we saw how Alfred's privacy could be violated in surprising ways. We saw that Alfred could be tracked by IP address or cookie, and that information given to one site could be easily correlated with his name and address given to another. In the market and social norms section, we saw that the technology to collect, store, analyze, and share Alfred's information was cheap. We also saw that although people are concerned about their personal privacy online, Alfred was likely to give up his information anyway. Finally, the legal section explained that Alfred's video rental receipts are likely to receive more privacy protection than his concerns over prostate cancer. We discussed P3P and TRUSTe as potential solutions to Alfred's privacy problems, but saw that the background legal and technical regimes prevent the success of both of these potential solutions.
In response, we proposed the ACCPT legal framework and technical guidelines.
These would ensure that Alfred would be aware of the way in which his private
information might be used before he agreed to divulge it. If Alfred didn’t want his visit to
ProstrateInfo.Com to be logged or given to other entities, he could either come to that
agreement with ProstrateInfo.Com, or find another site that would agree to those terms.
However, as discussed in the critique section, Alfred’s privacy comes at a price.
Companies will need to be more careful about how they handle information. Individuals
will need to install agents and assent to their defaults. Web sites will open themselves to liability for using information in ways that currently carry no potential penalty. Computer programmers wanting to write browsers with ACCPT-compliant agents will find it technically more challenging, and will also open themselves up to potential liability that is not present under the current framework. Furthermore, our proposal does not solve thorny internationalization issues or completely eliminate profiling.
Privacy regulation is a balancing act. This proposal strikes that balance in favor of
the individual. ACCPT protects individual privacy while enabling electronic commerce
and providing a new market for information where the players are on more even ground.
ACCPT attempts to enfranchise everyone, not only the rich or tech-savvy. Furthermore,
ACCPT is flexible enough to encourage diversity in implementation and privacy policies
while still allowing collective action.
As the Information Revolution chugs on its course, the United States has
demonstrated an extreme reluctance to offer legislative guidance as to how it should
proceed. Other than pledging protection for its own records, and enacting a few stop-gap
protection measures that correspond to the day's headlines, the federal government has
adhered to its rosy view of industry whereby a few fierce words from an extremely probusiness administration will prompt businesses to forego an extremely easy and
profitable activity. If it was not clear before now that such an approach had little
likelihood of success, the FTC's report to Congress should have been the final nail in the
coffin of the idea of self-regulation.
America's reluctance to take drastic action may not be entirely predicated on the
influence of moneyed interests in Washington. The United States has a very long
tradition of freedom of information, and that norm might be one that legislators are
reluctant to disturb. It is a shocking thing to say in America, "You have access to the
knowledge, but you cannot use it." Though it would not be the first time such a thing
was done – as was mentioned earlier, we have protection for copyrights, for trade secrets,
and a host of other intellectual properties – ultimately what we are proposing to protect
here is information that an individual parts with in the real world on a daily basis, without
any expectation that that such information will be controlled. Our names, our addresses,
our phone numbers, our favorite television shows, our buying habits – we simply are not
used to thinking about these things as "private." When we see, however, the potential
harm caused by the unrestricted flow of this information in vast databanks, it becomes
clear that our traditional ideas about privacy must be radically altered to fit this new
world.
Our proposal is the first step in this process – it is intended to function as a
redefinition of what information is private. This is a context-sensitive redefinition: if the
man at the newspaper stand on the corner remembers that Alice buys the Washington Post every morning, that is not a privacy violation – but if Altavista "remembers" this information without her consent when Alice searches for the Post online, or DoubleClick remembers it because DoubleClick hears of it through Altavista, or Barnes & Noble remembers it because Alice followed a link to its website – these are, indeed, violations
of her privacy. The problem certainly extends beyond the web – we are living in an age
where vast stores of data about every person in this country are freely available, by going
down to the courthouse, or a Hall of Records. Without a workable, context-sensitive
definition of privacy, our willingness to make those records available publicly on paper
will translate to a willingness to make those records available publicly in electronic form
– which, because of the ease with which electronic records can be accessed and searched,
may very well have a qualitatively different meaning of "public" than was present in the
case of a public record in a file in a cabinet in the basement of City Hall. Our proposal
lays the groundwork for this new understanding – variations of "public" and "private" depending on who has access to the information, how it is used, and how much control the individual has over it – but there is still much to be done. We must alter our understanding of "public" and "private" from how these terms were meant 200 years ago, when basic decisions about the structure of our legal architecture were made, to what those terms mean now, when our technological architecture has reached a place that no one could have envisioned when the rules were created. As the web and the use of computers continue to grow, the importance of this process cannot be overestimated – and it requires more direction from our government than a simple hope that ultimately, everything will work itself out. Our proposal has the courage to take a stand and suggest
ways that the government can help to protect personal privacy. We answer Al Gore’s
challenge for an electronic bill of privacy rights – the only question now is whether
government will have the courage to follow.
APPENDIX A: TABLE OF PROPOSED LEGISLATION
DURING THE 105TH CONGRESS222
Protection of Children
H.R. 1972
Children's Privacy Protection and Parental Empowerment Act of 1997
Prohibits "list brokers" from selling, purchasing info about children without written
consent of parent. Requires marketers to give access about child to parent, prohibits using
prison inmate labor to process children's' information. Requires marketers to give lists to
National for Missing and Exploited Children for data matching purposes. The bill has 52
cosponsors.
H.R. 2368
Data Privacy Act of 1997
Interactive computer services may not collect any information about a child or disclose
this information without notifying the child, in advance of collection or use, that the
child should not provide any information without the consent of his or her parent. No
person shall solicit or collect information from a child regarding his or her parents. Part
of a more comprehensively conceived protection.
H.R. 2815
Protecting Children From Internet Predators Act of 1997
Criminalizes sending sexually explicit messages to people under 16, even if the sender is under that age.
H.R. 3494
Child Protection and Sexual Predator Punishment Act of 1998
Criminalizes sending sexual material to a minor. The minimum prison term for using a computer is 3 years. Allows use of subpoenas to obtain evidence instead of warrants.
S. 504
Children's Privacy Protection and Parental Empowerment Act of 1997
Prohibits the sale of personal information about children without their parents' consent.
222 All information from EPIC Online Guide to 105th Congress Privacy and Cyber-Liberties Bills (last visited Dec. 3, 1998) <http://www.epic.org/privacy/bill_track.html>.
S. 1965
Internet Predator Prevention Act of 1998
Prohibits the publication of identifying information relating to a minor for criminal sexual
purposes.
S. 1987
Child Protection and Sexual Predator Punishment Act of 1998
Increases penalties for transmitting obscene materials to minors and for contacting minors using the net "for the purpose of engaging in any sexual activity".
Protecting Personal Identification Data
H.R. 98
Consumer Internet Privacy Protection Act of 1997
Interactive computer services may not disclose personally identifiable information
provided by their subscribers to third parties without subscriber's written consent, which
may be revoked at any time. Upon subscriber's request, service must provide the
subscriber's personally identifiable information maintained by the service to the
subscriber for verification and correction purposes. Upon subscriber's request, service
must inform the subscriber of the identity of the third party recipients of the subscriber's
personally identifiable information. Additionally, this act would prohibit knowing
disclosure of falsified information.
H.R. 1287
Social Security On-line Privacy Protection Act of 1996
Interactive computer services may not disclose Social Security account numbers or
personal information that is identifiable to an individual by means of his or her Social
Security number without the individual's written consent, which may be revoked at any
time. See "Regulating On-Line Databases."
H.R. 1330
American Family Privacy Act of 1997
Prohibits Federal officers and employees from providing access to an individual's social security account statement information, personal earnings and benefits estimate statement information, or tax return information through the Internet or without the written consent of the individual, and establishes a commission to investigate the protection and privacy afforded to certain Government records.
H.R. 1331
Social Security Information Safeguards Act of 1997
Requires the Commissioner of Social Security to assemble a panel of experts to assist the
Commissioner in developing appropriate mechanisms and safeguards to ensure
confidentiality and integrity of personal Social Security records made accessible to the
public.
H.R. 1367
Federal Internet Privacy Protection Act of 1997
Federal agencies may not make available through the Internet any record with respect to
an individual. For the purposes of this act, records include information with respect to
education, financial transactions, medical history, or employment history that contains
the name or identifier assigned to the individual.
H.R. 1813/S. 600
Personal Information Privacy Act of 1997
Persons may not buy, sell, or trade Social Security account numbers or use Social Security account numbers for identification without the written consent of the individual. The individual must be informed of all purposes for which the number will be used and of all persons to whom the number will be known.
H.R. 2368
Data Privacy Act of 1997
This act creates an industry working group to establish guidelines to limit the collection
and commercial use of personally identifiable information obtained from individuals
through any interactive computer services. The guidelines shall apply to those Providers
or persons who voluntarily agree to such applicability. These guidelines shall require
interactive computer services to inform subscribers of the nature of the information
collected and allow subscribers to prevent disclosure of this information. Upon
subscriber's request, service must provide the subscriber's personally identifiable
information maintained by the service to the subscriber for verification and correction
purposes. Part of a more comprehensively conceived protection.
H.R. 2581
Social Security Privacy Act of 1997
Limits use of Social Security number. Requires disclosure of uses of SSN by businesses.
Protecting Medical Records
H.R. 1815
Medical Privacy in the Age of New Technologies Act of 1997
Protects the privacy of health information in the age of genetic and other new
technologies.
H.R. 2198
Genetic Privacy and Nondiscrimination Act of 1997
Would limit use and disclosure of genetic information by health insurance companies;
prohibit employers from attempting to acquire, or to use, genetic information, or "to
require a genetic test of an employee or applicant for employment" or to disclose the
information.
H.R. 2215
Genetic Nondiscrimination in the Workplace Act
Amends the Fair Labor Standards Act to restrict employers from obtaining, disclosing, and using genetic information.
H.R. 2216
Genetic Protection in Insurance Coverage Act
Limits the disclosure and use of genetic information by life and disability insurers. Prohibits insurers from requiring genetic tests, denying coverage, setting rates based on genetics, or using or maintaining genetic information.
H.R. 2368
Data Privacy Act of 1997
No person may use, for commercial marketing purposes, any personal health or medical
information obtained through an interactive computer service without prior consent of the
individual. No person may display the social security account number of an individual, through the use of an interactive computer service, except when part of a government
record available to the general public or pursuant to industry guidelines. Part of a more
comprehensively conceived protection.
H.R. 3755
Prescription Privacy Protection Act of 1998
Restricts disclosure of pharmacy records.
H.R. 3900
Consumer Health and Research Technology (CHART) Protection Act
A bill to establish Federal penalties for prohibited uses and disclosures of individually identifiable health information, to establish a right of individuals to inspect and copy their own health information, and for other purposes. Allows disclosure to the government without a warrant and to researchers with little demonstrated need.
H.R. 4250
Patient Protection Act of 1998
Republican Health Care bill. Sets weak standards for privacy, prohibits states from
passing stronger protections. Approved by the House 216-210 on July 24.
S. 89
Genetic Information Nondiscrimination in Health Insurance Act of 1997
Prohibits denial of insurance based on genetic information. Prohibits insurance
companies from requesting genetic information. Prohibits disclosing genetic information
without written consent.
S. 1368
Medical Information Privacy and Security Act
General medical privacy bill.
S. 2352
The Patient Privacy Rights Act
Repeals the "unique medical identifiers" requirement of the Health Insurance Portability
Law of 1996 (HIPAA).
Regulating Direct Marketing and Unsolicited E-Mail
H.R. 1748
Netizens Protection Act of 1997
Persons may not send unsolicited advertisements to an electronic mail address of an
individual with whom they lack a pre-existing or ongoing business or personal
relationship, unless the individual provides express invitation or permission. Persons
may not send unsolicited advertisements without providing the identity and the return
electronic mail address of the business or individual sending the message at the beginning
of the unsolicited ad.
H.R. 2368
Data Privacy Act of 1997
Persons may not send unsolicited advertisements without providing the identity, the
physical address, the electronic mail address, and the phone number of the initiator of the
message. The business or trade name must appear in the subject line of the message.
The guidelines must provide a method by which an individual may choose to prohibit the
delivery of unsolicited commercial electronic mail. Part of a more comprehensively
conceived protection.
H.R. 4124
E-Mail User Protection Act of 1998
Anti-spam bill.
H.R. 4176
Digital Jamming Act of 1998
Anti-spam bill.
S. 771
Unsolicited Commercial Electronic Mail Choice Act of 1997
Persons sending unsolicited electronic mail advertisements must include the word
"advertisement" in the subject line of the message and include the sender's electronic mail
address, physical address, and telephone number in the body of the message. Interactive
computer services must provide a system for blocking any electronic mail that contains
the word "advertisement" in its subject line and inform its subscribers of this service.
Once such a blocking system is in place, interactive computer services must block
unsolicited advertisements at the request of the individual subscriber and may block
these advertisements upon its own initiative.
S. 875
Electronic Mailbox Protection Act of 1997
Persons may not transmit unsolicited electronic mail messages from an unregistered
electronic mail address or disguise the source of unsolicited electronic mail messages.
Persons may not send unsolicited electronic mail to individuals who have requested no
further messages and may not distribute that individual's electronic mail address to others
who intend to use the address for sending unsolicited electronic mail. Persons may not
send unsolicited electronic mail messages in contravention of the rules of an interactive
computer service or access the server of such a service to collect the electronic mail
addresses of its subscribers for the purpose of sending unsolicited electronic mail
advertisements.
Regulating On-Line Databases
H.R. 1226
Taxpayer Browsing Protection Act
Criminalizes "browsing" of taxpayer files by IRS employees. Passed by the House April
15, 1997. Placed on the Calendar in the Senate (CR S3346) on 4/17/97. Signed August 7,
1997 (PL 105-35).
H.R. 1287
Social Security On-line Privacy Protection Act of 1996
Prohibits computer services (such as Lexis-Nexis) from disclosing a person's SSN without permission. See "Protecting Personal Identification Data."
H.R. 2652
Collections of Information Anti-Piracy Act
Creates a property right in databases of information, even for public domain information. Approved by the House on a voice vote on May 19. Referred to the Senate Judiciary Committee.
S. 2291
Collections of Information Anti-Piracy Act
Creates a new form of intellectual property for databases. Unfortunately, it does not protect the individuals from whom information is collected, except from third parties who would misappropriate the database from the collector.
APPENDIX B: EXAMPLE P3P PROPOSAL
The following code should be placed in your default Web site page. This HTML proposal will allow your visitors' browsers to read, in machine-readable form, the privacy practices described in the human-readable statement that follows it.
<PROP
    agreeID="d273e4b4dc86577621ad58feaea0704e"
    realm="http://web.mit.edu/draco/www/group3/"
    entity="Group3-privacy">
  <!-- Statement 1: online contact information (category 1) - home and
       business e-mail addresses and URIs - with the consequence shown
       to the user -->
  <USES>
    <STATEMENT
        VOC:purp="0,1,2,3"
        VOC:recpnt="0"
        VOC:id="0"
        consq="It will help us better serve your interests as you browse our website.">
      <DATA:REF category="1"/>
      <WITH>
        <PREFIX name="User.">
          <DATA:REF name="Home.Online.Email" category="1"/>
          <DATA:REF name="Home.Online.URI" category="1"/>
          <DATA:REF name="Business.Online.Email" category="1"/>
          <DATA:REF name="Business.Online.URI" category="1"/>
        </PREFIX>
      </WITH>
  </USES>
  <!-- Statement 2: computer information (category 4); VOC:id="1"
       indicates the data may be used to personally identify a user -->
  <USES>
    <STATEMENT
        VOC:purp="0,1,2,3"
        VOC:recpnt="0"
        VOC:id="1">
      <DATA:REF category="4"/>
  </USES>
  <!-- Statement 3: clickstream data (category 5), form/transaction data
       (category 6), and content data (category 9) -->
  <USES>
    <STATEMENT
        VOC:purp="0,1,2,3"
        VOC:recpnt="0"
        VOC:id="0">
      <WITH>
        <PREFIX name="ClickStream.">
          <DATA:REF name="Client_" category="5"/>
        </PREFIX>
      </WITH>
      <WITH>
        <PREFIX name="Form.">
          <DATA:REF name="StoreNegotiation_" category="6"/>
          <DATA:REF name="SearchText_" category="6"/>
        </PREFIX>
      </WITH>
      <DATA:REF category="9"/>
  </USES>
  <!-- General disclosure: pointer to the full policy, user access level,
       and renegotiation/retention disclosures -->
  <VOC:DISCLOSURE
      discuri="http://web.mit.edu/draco/www/group3/privacypolicy.html"
      access="3"
      other-disclosure="0,1">
  <ASSURANCE org=""/>
</PROP>
GROUP3-PRIVACY'S PRIVACY STATEMENT
Group3-privacy makes the following statement for the Web pages at <http://web.mit.edu/draco/www/group3/>. This is the human-readable version of the above P3P proposal.
Online Contact Information
Group3-privacy may collect the following data regarding a user's home:
an e-mail address or a URI.
Group3-privacy could collect information regarding a user's business:
an e-mail address or a URI.
Group3-privacy makes a statement regarding how a user might benefit by providing this
information:
"It will help us better serve your interests as you browse our website."
Group3-privacy uses this information for the purposes of completion and support of
current activity, Web site and system administration, customization of a Web site to an
individual and research and development.
This information may be distributed to ourselves and our agents.
Computer Information
We collect information about a user's computer system and associate this information
with other identifiable information collected from the user. Group3-privacy uses this
information for the purposes of completion and support of current activity, Web site and
system administration, customization of a Web site to an individual and research and
development.
This information may be distributed to ourselves and our agents. Moreover, this
information could be used in such a way as to personally identify a user.
Clickstream and Navigation Data Information
Group3-privacy may collect the following types of passively gathered data: click stream
data collected on the server. Group3-privacy uses this information for the purposes of
completion and support of current activity, Web site and system administration,
customization of a Web site to an individual and research and development.
This information may be distributed to ourselves and our agents.
Transaction Data
Group3-privacy may collect the following types of information generated through
explicit transactions (i.e., search queries): stored negotiation history data or search query
data. Group3-privacy uses this information for the purposes of completion and support of
current activity, Web site and system administration, customization of a Web site to an
individual and research and development.
This information may be distributed to ourselves and our agents.
Content Data
Our Web site contains elements that would allow a user to communicate with another
user or to post information so that others may access it at
http://web.mit.edu/draco/www/group3/. Group3-privacy uses this information for the
purposes of completion and support of current activity, Web site and system
administration, customization of a Web site to an individual and research and
development.
This information may be distributed to ourselves and our agents.
General Disclosure
No user access to identifiable information is given.
Group3-privacy makes a disclosure in our policy posted at
http://web.mit.edu/draco/www/group3/ regarding the capability for a user to cancel or
renegotiate an existing agreement at a future time.
Group3-privacy makes a disclosure in our policy posted at
http://web.mit.edu/draco/www/group3/ regarding how long data is retained.
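To suggest how a browser agent might act on a proposal like the one above, the following Python sketch scans the proposal for statements whose data may be used in a personally identifiable way (VOC:id="1") and checks them against a simple user preference. The parsing approach and the preference rule are our own illustrative assumptions and are not drawn from the P3P specification.

# A hypothetical sketch of an agent-side check, not taken from the P3P
# specification: it flags statements declaring identifiable use against
# a simple user preference. Attribute names follow the example proposal.
import re

PROPOSAL = '''
<STATEMENT VOC:purp="0,1,2,3" VOC:recpnt="0" VOC:id="0">
<STATEMENT VOC:purp="0,1,2,3" VOC:recpnt="0" VOC:id="1">
'''

def identifiable_statements(proposal: str):
    """Yield the attribute dict of each STATEMENT that declares its data
    may be used in a personally identifiable way (VOC:id="1")."""
    for tag in re.findall(r"<STATEMENT([^>]*)>", proposal):
        attrs = dict(re.findall(r'([\w:-]+)="([^"]*)"', tag))
        if attrs.get("VOC:id") == "1":
            yield attrs

# A user whose agent is configured to refuse identifiable collection would
# reject this proposal: the computer-information statement carries
# VOC:id="1", so the agent would renegotiate or leave the site.
if any(identifiable_statements(PROPOSAL)):
    print("Proposal conflicts with user preference: renegotiate or leave.")
else:
    print("Proposal acceptable under user preferences.")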