MDAW 15
Biometrics
Biometrics Aff / Neg: Practice Version
Affirmative
Status Quo: 1AC ............................................................................................................................................................................................ 2
Harms: 1AC (1/3) ........................................................................................................................................................................................... 3
Harms: 1AC (2/3) ........................................................................................................................................................................................... 4
Harms: 1AC (3/3) ........................................................................................................................................................................................... 5
Solvency: 1AC (1/3) ....................................................................................................................................................................................... 6
Solvency: 1AC (2/3) ....................................................................................................................................................................................... 7
Solvency: 1AC (3/3) ....................................................................................................................................................................................... 8
Status Quo Ext: No Court Limits Now ............................................................................................................................................................. 9
Status Quo Ext: No Regulations Now ........................................................................................................................................................... 10
Harms Ext: First Amendment Rights ............................................................................................................................................................ 11
Harms Ext: NGI ............................................................................................................................................................................................ 12
Harms Ext: Privacy Rights (1/2) ................................................................................................................................................................... 13
Harms Ext: Privacy Rights (2/2) ................................................................................................................................................................... 14
Harms Ext: Race / Sorting ............................................................................................................................................................................ 15
Solvency Ext: Congress ............................................................................................................................................................................... 16
DA Ans: Terrorism........................................................................................................................................................................................ 17
Negative
Neg: Terror DA Link—1NC ........................................................................................................................................................................... 18
Neg: Terror DA Link—2NC ........................................................................................................................................................................... 19
Neg: Abuse Answers .................................................................................................................................................................................... 20
Neg: Privacy Answers .................................................................................................................................................................................. 21
Neg: Private Sources Biggest Threat ........................................................................................................................................................... 22
Status Quo: 1AC
Observation One: The Status Quo
A. There are no existing meaningful constitutional or statutory barriers to the use of
biometric information
Laura K. Donohue, Associate Professor, Law, Georgetown University, “Technological Leap, Statutory Gap, and Constitutional Abyss:
Remote Biometric Identification Comes of Age,” MINNESOTA LAW REVIEW v. 97, 12—12, p. 556.
The past decade has witnessed a sudden explosion in remote biometric identification. Congress, however, even as it
has required the Executive to develop and use these technologies, has not placed meaningful limits on the use of such
powers. Gaps in the 1974 Privacy Act and its progeny, as well as the 1990 Computer Act, in conjunction with explicit
exemptions in the Privacy Act and the 2002 E-Government Act, remove most biometric systems from the statutes'
remit. As a matter of criminal law, Title III of the 1968 Omnibus Crime Control and Safe Streets Act and Title I of the 1986 Electronic Communications Privacy Act say nothing about RBI. In
the national security statutory realm, it is unclear whether remote biometric technologies are currently included within the 1978 Foreign Intelligence Surveillance Act's definition of electronic
surveillance. The statute's dependence, moreover, on the distinction between U.S. persons and non-U.S. persons presents a direct challenge to the way in which RBI operates. At the same
time, principles enshrined in the statute appear inapplicable to the RBI context. Recourse to constitutional challenge provides little by way of respite:
Fourth Amendment jurisprudence fails to address the implications of RBI. Fifth Amendment rights against self-incrimination, First Amendment protections on the right to free speech and free assembly, and Fifth and Fourteenth
Amendment due process concerns similarly fall short.
B. The use of biometric information is expanding rapidly—especially the use of facial
recognition technology
Jennifer Lynch, Staff Attorney, Electronic Frontier Foundation, Testimony before the Senate Committee on the Judiciary, Subcommittee on
Privacy, Technology, and the Law, 7—18—12, www.eff.org/files/filenode/jenniferlynch_eff-senate-testimony-face_recognition.pdf.
Although the collection of biometrics—including face recognition-ready photographs—seems like science fiction, it is
already a well-established part of our lives in the United States. The Federal Bureau of Investigation (FBI) and Department of Homeland Security (DHS)
each have the largest biometrics databases in the world, and both agencies are working to add extensive facial
recognition capabilities. The FBI has partnered with several states to collect face recognition-ready photographs of all
suspects arrested and booked, and, in December 2011, DHS signed a $71 million contract with Accenture to incorporate facial
recognition and allow real-time biometrics sharing with the Department of Justice (DOJ) and Department of Defense (DOD). State and
local law enforcement agencies are also adopting and expanding their own biometrics databases to incorporate face
recognition, and are using handheld mobile devices to allow biometrics collection in “the field.”
C. The threat will only grow as biometric data is consolidated—the FBI’s Next Generation
Identification (NGI) system is the biggest threat
Jennifer Lynch, Staff Attorney, Electronic Frontier Foundation, Testimony before the Senate Committee on the Judiciary, Subcommittee on
Privacy, Technology, and the Law, 7—18—12, www.eff.org/files/filenode/jenniferlynch_eff-senate-testimony-face_recognition.pdf.
In the last few years, federal, state and local governments have been pushing to develop “multimodal” biometric systems that
collect and combine two or more biometrics (for example, photographs and fingerprints), arguing that collecting multiple biometrics from each subject will make identification systems more
accurate. The FBI’s Next Generation Identification (NGI) database represents the most robust effort to introduce and streamline multimodal
biometrics collection. FBI has stated it needs “to collect as much biometric data as possible . . . and to make this information accessible to all levels of law enforcement, including International agencies.”
Accordingly, it has been working “aggressively to build biometric databases that are comprehensive and international in
scope.” The biggest and perhaps most controversial change brought about by NGI will be the addition of face recognition-ready photographs. The FBI has already started collecting such photographs through a pilot program with a handful of states. Unlike traditional mug shots, the new NGI photos may be taken
from any angle and may include close-ups of scars, marks and tattoos. They may come from public and private sources, including from private security cameras, and may or may not be linked to a specific person’s record (for
example, NGI may include crowd photos in which many subjects may not be identified). NGI will allow law enforcement, correctional facilities, and criminal justice
agencies at the local, state, federal, and international level to submit and access photos, and will allow them to submit
photos in bulk. The FBI has stated that a future goal of NGI is to allow law-enforcement agencies to identify subjects in
“public datasets,” which could include publicly available photographs, such as those posted on Facebook or
elsewhere on the Internet. Although a 2008 FBI Privacy Impact Assessment (PIA) stated that the NGI/IAFIS photo database does not collect information from “commercial data aggregators,” the PIA
acknowledged this information could be collected and added to the database by other NGI users such as state and local law-enforcement agencies. The FBI has also stated that it hopes to be able to use NGI to track people as they
move from one location to another.
Harms: 1AC (1/3)
A. The expanded use of biometric technology risks unlimited, full-time warrantless surveillance
Kimberly N. Brown, Associate Professor, Law, University of Baltimore, “Anonymity, Faceprints, and the Constitution,” GEORGE MASON LAW REVIEW v. 21 n. 2, 2014, p. 409.
Rapid technological advancement has dramatically expanded the warrantless powers of government to obtain
information about individual citizens directly from the private domain. Biometrics technology—such as voice recognition, hand measurement,
iris and retinal imaging, and facial recognition technology (“FRT”)—offers enormous potential for law enforcement and national security. But it comes at a cost. Although much of the
American public is complacent with government monitoring for security reasons, people also expect to go about daily
life in relative obscurity— unidentifiable to others they do not already know, do not care to know, or are not required to
know—so long as they abide by the law. The reality is quite different. The government and the private sector have the
capacity for surveillance of nearly everyone in America. As one commentator puts it, “soon there really will be nowhere to run and
nowhere to hide, short of living in a cave, far removed from technology.”
B. The unchecked use of facial recognition technology causes serious problems—acts
as a form of social control, causes emotional harm, and is prone to official abuse,
threatening our rights
Kimberly N. Brown, Associate Professor, Law, University of Baltimore, “Anonymity, Faceprints, and the Constitution,” GEORGE MASON LAW REVIEW v. 21 n. 2, 2014, p. 434-436.
In the big data universe, FRT enables users to trace a passerby’s life in real time—past, present, and future—through the relation
of faceprint algorithms with other data points. Although the American public’s reaction to the NSA’s Prism program was relatively muted, most people understand
the awkward feeling of being stared at on a bus. Constant surveillance by the government is more pernicious. It
discovers a person’s identity and then augments that information based on intelligence that today’s technology
renders limitless. The loss of anonymity that results from the detailed construction of a person’s identity through
ongoing monitoring can lead to at least three categories of harm, discussed below. First, as the panopticon suggests, ongoing identification
and tracking can adversely influence behavior. People involuntarily experience “self censorship and inhibition” in
response to the feeling of being watched. They “might think twice,” for example, “before visiting websites of extreme sports or watching sitcoms
glorifying couch potatoes if they felt this might result in higher insurance premiums.” The social norms that develop around expectations of surveillance can, in
turn, become a tool for controlling others. To be sure, social control is beneficial for the deterrence of crime and management of other negative behavior. Too much social
control, however, “can adversely impact freedom, creativity, and self-development.” Professor Jeffrey Rosen has explained that the
pressures of having one’s private scandals “outed” can push people toward socially influenced courses of action that,
without public disclosure and discussion, would never happen. They are less willing to voice controversial ideas or
associate with fringe groups for fear of bias or reprisal. With the sharing of images online, the public contributes to
what commentators have called “sousveillance” or a “reverse Panopticon” effect, whereby “the watched become the
watchers.” Second, dragnet-style monitoring can cause emotional harm. Living with constant monitoring is stressful,
inhibiting the subject’s ability to relax and negatively affecting social relationships. When disclosures involve
particularly sensitive physical or emotional characteristics that are normally concealed—such as “[g]rief, suffering, trauma, injury, nudity, sex,
urination, and defecation”—a person’s dignity and self-esteem is affected, and incivility toward that person increases. Third,
constant surveillance through modern technologies reduces accountability for those who use the data to make
decisions that affect the people they are monitoring. The collection of images for FRT applications is indiscriminate,
with no basis for suspecting a particular subject of wrongdoing. It allows users to cluster disparate bits of information
together from one or more random, unidentified images such that “[t]he whole becomes greater than the parts.” The individuals whose images are
captured do not know how their data is being used and have no ability to control the manipulation of their faceprints,
even though the connections that are made reveal new facts that the subjects did not knowingly disclose. The party
doing the aggregating gains a powerful tool for forming and disseminating personal judgments that render the subject
vulnerable to public humiliation and other tangible harms, including criminal investigation. Incorrect surveillance information can lead
to lost job opportunities, intense scrutiny at airports, false arrest, and denials of public benefits. In turn, a lack of
transparency, accountability, and public participation in and around surveillance activities fosters distrust in
government. The recent scandal and fractured diplomatic relations over NSA surveillance of U.S. allies is a case in point. Perhaps most troubling, FRT enhances users’ capacity to
identify and track individuals’ propensity to take particular actions, which stands in tension with the common law
presumption of innocence embodied in the Due Process Clause of the Fifth and Fourteenth Amendments. As described below,
prevailing constitutional doctrine does not account for the use of technology to identify, track, and predict the behavior
of a subject using an anonymous public image and big data correlations.
Harms: 1AC (2/3)
C. The spread of biometric data access risks the rise of a national surveillance state
Sunita Patel and Scott Paltrowitz, attorneys, Center for Constitutional Rights, “How Far Will the Government Go in Collecting and Storing All
Our Personal Data?,” COMMON DREAMS, 11—10—11, www.commondreams.org/views/2011/11/10/how-far-will-government-go-collecting-and-storing-all-our-personal-data.
The newly-released documents also reveal that not only do the Department of Homeland Security (DHS) and the
Department of Justice already have access to this personal information, but so do the Department of Defense, the U.S.
Coast Guard, foreign governments, and potentially the Nuclear Regulatory Commission. Indeed, CJIS has information-sharing relationships with more than 75 countries. This ubiquitous world-wide surveillance of anyone and everyone
should serve as a wake up call for what the future may hold. Rapid deployment of the new technologies uncovered in the
FOIA records brings us closer to an extensive and inescapable surveillance state, where we blindly place our hands on
electronic devices that capture our digital prints, stare into iris scanning devices that record the details of our eyes,
and have pictures taken of different angles of our faces so that the FBI and other federal agencies can store and use
such information. Some state and local officials have heavily resisted sharing the fingerprints of non-citizens with immigration authorities, as
it can cause community members to fear reporting crime, break up families through unjust detentions and deportations, and lead to law
enforcement abuse of authority or racial profiling. Yet the FBI and DHS have prohibited states and localities from placing limits
on the FBI’s use of the data. Recently, high-ranking Immigration and Customs Enforcement official Gary Mead was asked by local
advocates at a debate in New York why the agency insisted on overriding the governors’ request to prevent federal sharing of immigrant prints.
He responded that allowing the FBI to share data with other agencies is the “price of admission” for joining “the FBI club.” But none of us—
those paying the price of having our personal data collected, analyzed, and shared—want the FBI club to
indiscriminately share our personal information. Expanding NGI raises numerous concerns about government invasion
of privacy (because of the access, retention, use, and sharing of biometric information without individual consent or
knowledge), the widening of federal government surveillance (the NGI database will hold information that can be used
to track individuals long into the future), and the increased risk of errors and vulnerability to hackers and identity
thieves.
D. Privacy is key to human dignity—a value-to-life impact
Michael McFarland, S.J., a computer scientist with extensive liberal arts teaching experience and 31st president of the College of
the Holy Cross, “Why We Care about Privacy,” 6—12, http://www.scu.edu/ethics/practicing/focusareas/technology/internet/privacy/why-careabout-privacy.html
Privacy is important for a number of reasons. Some have to do with the consequences of not having privacy. People can be harmed or
debilitated if there is no restriction on the public's access to and use of personal information. Other reasons are more
fundamental, touching the essence of human personhood. Reverence for the human person as an end in itself and as
an autonomous being requires respect for personal privacy. To lose control of one's personal information is in some
measure to lose control of one's life and one's dignity. Therefore, even if privacy is not in itself a fundamental right, it is
necessary to protect other fundamental rights. In what follows we will consider the most important arguments in favor of privacy.
Protection from the Misuse of Personal Information There are many ways a person can be harmed by the revelation of sensitive
personal information. Medical records, psychological tests and interviews, court records, financial records—whether from banks, credit
bureaus or the IRS—welfare records, sites visited on the Internet and a variety of other sources hold many intimate details of a person's life. The
revelation of such information can leave the subjects vulnerable to many abuses. Good information is needed for good decisions. It might seem
like the more information the better. But sometimes that information is misused, or even used for malicious purposes. For
example, there is a great deal of misunderstanding in our society about mental illness and those who suffer from it. If it becomes known that a
person has a history of mental illness, that person could be harassed and shunned by neighbors. The insensitive remarks and behavior of others
can cause the person serious distress and embarrassment. Because of prejudice and discrimination, a mentally ill person who is quite capable
of living a normal, productive life can be denied housing, employment and other basic needs. Similarly someone with an arrest record, even
where there is no conviction and the person is in fact innocent, can suffer severe harassment and discrimination. A number of studies have
shown that employers are far less likely to hire someone with an arrest record, even when the charges have been dropped or the person has
been acquitted. In addition, because subjects can be damaged so seriously by the release of sensitive personal information, they are also
vulnerable to blackmail and extortion by those who have access to that information.
Harms: 1AC (3/3)
E. Widespread biometric surveillance technology would destroy anonymity and will be
used to criminalize populations
Kyle Chayka, journalist, “Biometric Surveillance Means Someone Is Always Watching,” NEWSWEEK, 4—25—14, Expanded Academic
ASAP.
What would a world look like with comprehensive biometric surveillance? "If cameras connected to databases can do
face recognition, it will become impossible to be anonymous in society," Lynch says. That means every person in the U.S.
would be passively tracked at all times. In the future, the government could know when you use your computer, which
buildings you enter on a daily basis, where you shop and where you drive. It's the ultimate fulfillment of Big Brother
paranoia. But anonymity isn't going quietly. Over the past several years, mass protests have disrupted governments in countries across the
globe, including Egypt, Syria and Ukraine. "It's important to go out in society and be anonymous," Lynch says. But face recognition could make
that impossible. A protester in a crowd could be identified and fired from a job the next day, never knowing why. A mistaken face-print algorithm
could mark the wrong people as criminals and force them to escape the specter of their own image. If biometric surveillance is allowed to
proliferate unchecked, the only option left is to protect yourself from it. Artist Zach Blas has made a series of bulbous masks, aptly named the
"Facial Weaponization Suite," that prepare us for just such a world. The neon-colored masks both disguise the wearer and make the rest of us
more aware of how our faces are being politicized. "These technologies are being developed by police and the military to
criminalize large chunks of the population," Blas says of biometrics. If cameras can tell a person's identity, background and
whereabouts, what's to stop the algorithms from making the same mistakes as governmental authorities, giving racist
or sexist biases a machine-driven excuse? "Visibility," he says, "is a kind of trap."
F. This type of discrimination results in massive violence and perpetual war
Eduardo Mendieta, SUNY-Stony Brook, “’To Make Live and to Let Die’—Foucault on Racism,” Meeting of the Foucault Circle, APA Central
Division Meeting, Chicago IL, 4—25—02, www.stonybrook.edu/commcms/philosophy/people/faculty_pages/docs/foucault.pdf.
This is where racism intervenes, not from without, exogenously, but from within, constitutively. For the emergence of biopower as the form of a
new form of political rationality, entails the inscription within the very logic of the modern state the logic of racism. For racism grants, and here I am quoting: “the conditions
for the acceptability of putting to death in a society of normalization. Where there is a society of normalization, where there is a power that
is, in all of its surface and in first instance, and first line, a bio-power, racism is indispensable as a condition to be able to put to death
someone, in order to be able to put to death others. The homicidal [meurtrière] function of the state, to the degree that the
state functions on the modality of bio-power, can only be assured by racism” (Foucault 1997, 227). To use the formulations from his 1982 lecture “The Political
Technology of Individuals” –which incidentally, echo his 1979 Tanner Lectures –the power of the state after the 18th century, a power which is enacted through the
police, and is enacted over the population, is a power over living beings, and as such it is a biopolitics. And, to
quote more directly, “since the population is nothing more than what the state takes care of for its own sake, of
course, the state is entitled to slaughter it, if necessary. So the reverse of biopolitics is thanatopolitics.”
(Foucault 2000, 416). Racism is the thanatopolitics of the biopolitics of the total state. They are two sides of one same political technology, one same
political rationality: the management of life, the life of a population, the tending to the continuum of life of a people. And with the inscription of racism within the
state of biopower, the long history of war that Foucault has been telling in these dazzling lectures has made a new turn: the war of
peoples, a war against invaders, imperial colonizers, which turned into a war of races, to then turn into a war of classes, has now
turned into the war of a race, a biological unit, against its polluters and threats. Racism is the means by which bourgeois political
power, biopower, re-kindles the fires of war within civil society. Racism normalizes and medicalizes war.
Racism makes war the permanent condition of society, while at the same time masking its weapons of
death and torture. As I wrote somewhere else, racism banalizes genocide by making quotidian the lynching of suspect
threats to the health of the social body. Racism makes the killing of the other, of others, an everyday
occurrence by internalizing and normalizing the war of society against its enemies. To protect society
entails we be ready to kill its threats, its foes, and if we understand society as a unity of life, as a
continuum of the living, then these threats and foes are biological in nature.
Solvency: 1AC (1/3)
Plan
Congress should impose restrictions on the collection, use, and retention of biometric
identification data, modeled after the Wiretap Act.
Solvency
A. Congress should use the wiretapping model to curtail the use of biometric
surveillance techniques
Jennifer Lynch, Staff Attorney, Electronic Frontier Foundation, Testimony before the Senate Committee on the Judiciary, Subcommittee on
Privacy, Technology, and the Law, 7—18—12, www.eff.org/files/filenode/jenniferlynch_eff-senate-testimony-face_recognition.pdf.
The over-collection of biometrics has become a real concern, but there are still opportunities—both technological and
legal—for change. Given the current uncertainty of Fourth Amendment jurisprudence in the context of biometrics and the fact
that biometrics capabilities are undergoing “dramatic technological change,” legislative action could be a good solution to curb the
overcollection and over-use of biometrics in society today and in the future. If so, the federal government’s response to
two seminal wiretapping cases in the late 60s could be used as a model. In the wake of Katz v. United States and Berger v.
New York, the federal government enacted the Wiretap Act, which laid out specific rules that govern federal wiretapping,
including the evidence necessary to obtain a wiretap order, limits on a wiretap’s duration, reporting requirements, and
a notice provision. Since then, law enforcement’s ability to wiretap a suspect’s phone or electronic device has been
governed primarily by statute rather than Constitutional case law. Congress could also look to the Video Privacy
Protection Act (VPPA), enacted in 1988, which prohibits the “wrongful disclosure of video tape rental or sale records” or “similar audio-visual
materials,” requires a warrant before a video service provider may disclose personally identifiable information to law enforcement, and includes a
civil remedies enforcement provision.
B. Rigorous standards guaranteeing privacy and directing the use of biometric
technology are necessary—failure to do so ensures the destruction of privacy with
minimal security benefits
William Abernathy and Lee Tien, staff, BIOMETRICS: WHO’S WATCHING YOU? Electronic Frontier Foundation, 9—03,
https://www.eff.org/wp/biometrics-whos-watching-you.
Our Major Concerns
Biometric technology is inherently individuating and interfaces easily to database technology, making privacy
violations easier and more damaging. If we are to deploy such systems, privacy must be designed into them from the
beginning, as it is hard to retrofit complex systems for privacy. Biometric systems are useless without a well-considered threat model. Before
deploying any such system on the national stage, we must have a realistic threat model, specifying the categories of
people such systems are supposed to target, and the threat they pose in light of their abilities, resources, motivations
and goals. Any such system will also need to map out clearly in advance how the system is to work, both in its successes and in its failures. Biometrics are no
substitute for quality data about potential risks. No matter how accurately a person is identified, identification alone
reveals nothing about whether a person is a terrorist. Such information is completely external to any biometric ID system. Biometric identification
is only as good as the initial ID. The quality of the initial "enrollment" or "registration" is crucial. Biometric systems are only as good as the initial identification, which in any
foreseeable system will be based on exactly the document-based methods of identification upon which biometrics are supposed to be an improvement. A terrorist with a fake passport would be
issued a US visa with his own biometric attached to the name on the phony passport. Unless the terrorist A) has already entered his biometrics into the database, and B) has garnered enough
suspicion at the border to merit a full database search, biometrics won't stop him at the border. Biometric identification is often overkill for the task at hand.
It is not necessary to identify a person (and to create a record of their presence at a certain place and time) if all you really want to know is whether they're entitled to do something or be
somewhere. When in a bar, customers use IDs to prove they're old enough to drink, not to prove who they are, or to create a record of their presence. Some biometric technologies are
discriminatory. A nontrivial percentage of the population cannot present suitable features to participate in certain biometric systems. Many people have fingers that simply do not "print well."
Even if people with "bad prints" represent 1% of the population, this would mean massive inconvenience and suspicion for that minority. And scale matters. The INS, for example, handles about
1 billion distinct entries and exits every year. Even a seemingly low error rate of 0.1% means 1 million errors, each of which translates to INS resources lost following a false lead.
Biometric systems' accuracy is impossible to assess before deployment. Accuracy and error rates published by biometric technology vendors are
not trustworthy, as biometric error rates are intrinsically manipulable. Biometric systems fail in two ways: false match (incorrectly matching a subject with someone else's reference sample) and
false non-match (failing to match a subject with her own reference sample). There's a trade-off between these two types of error, and biometric systems may be "tuned" to favor one error type
over another. When subjected to real-world testing in the proposed operating environment, biometric systems frequently fall short of the performance promised by vendors. The cost of failure is
high. If you lose a credit card, you can cancel it and get a new one. If you lose a biometric, you've lost it for life. Any biometric system must be built to the highest levels of data security,
including transmission that prevents interception, storage that prevents theft, and system-wide architecture to prevent both intrusion and compromise by corrupt or deceitful agents within the
organization. Despite
these concerns, political pressure for increasing use of biometrics appears to be informed and driven
more by marketing from the biometrics industry than by scientists. Much federal attention is devoted to deploying
biometrics for border security. This is an easy sell, because immigrants and foreigners are, politically speaking, easy targets. But once a system is created, new
uses are usually found for it, and those uses will not likely stop at the border. With biometric ID systems, as with national ID systems, we must be
wary of getting the worst of both worlds: a system that enables greater social surveillance of the population in general,
but does not provide increased protection against terrorists.
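The trade-off this evidence describes between false matches and false non-matches, and its scale arithmetic, can be made concrete. Below is a minimal sketch in Python: the 0.1% error rate and the roughly 1 billion annual INS entries and exits come from the card, while the threshold/error-rate pairings are hypothetical values invented purely to illustrate how "tuning" a matcher shifts errors between the two failure modes.

```python
# Scale arithmetic from the card: ~1 billion INS transactions/year at a
# "seemingly low" 0.1% error rate yields a million errors.
annual_transactions = 1_000_000_000
error_rate = 0.001
print(f"Errors per year at 0.1%: {annual_transactions * error_rate:,.0f}")  # 1,000,000

# A biometric matcher compares a similarity score against a threshold.
# Raising the threshold trades false matches for false non-matches.
# These (threshold: rates) pairs are hypothetical, for illustration only.
hypothetical_tuning = {
    0.90: (0.010, 0.001),   # permissive: many false matches
    0.95: (0.001, 0.010),   # middle ground
    0.99: (0.0001, 0.050),  # strict: many false non-matches
}
for threshold, (fmr, fnmr) in hypothetical_tuning.items():
    print(f"threshold={threshold}: "
          f"{annual_transactions * fmr:,.0f} false matches, "
          f"{annual_transactions * fnmr:,.0f} false non-matches")
```

This is why the card warns that vendor-published accuracy figures are "intrinsically manipulable": a vendor can quote whichever side of the trade-off looks best for the proposed deployment.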
Solvency: 1AC (2/3)
C. Existing safeguards are insufficient—the plan's restrictions are necessary to prevent abuses
Ginger McCall, attorney and founder, Advocates for Accountable Democracy, “The Face Scan Arrives,” NEW YORK TIMES, 8—29—13,
http://www.nytimes.com/2013/08/30/opinion/the-face-scan-arrives.html?_r=0.
The Department of Homeland Security is not the only agency developing facial-surveillance capacities. The Federal Bureau of Investigation
has spent more than $1 billion on its Next Generation Identification program, which includes facial-recognition
technology. This technology is expected to be deployed as early as next year and to contain at least 12 million searchable photos. The bureau
has partnerships with at least seven states that give the agency access to facial-recognition-enabled databases of
driver’s license photos. State agencies are also participating in this technological revolution, though not yet using video
cameras. On Monday, Ohio’s attorney general, Mike DeWine, confirmed reports that law enforcement officers in his state, without public notice, had
deployed facial-recognition software on its driver’s license photo database, ostensibly to identify criminal suspects. A total of 37 states have
enabled facial-recognition software to search driver’s license photos, and only 11 have protections in place to limit
access to such technologies by the authorities. Defenders of this technology will say that no one has a legitimate expectation of privacy
in public. But as surveillance technology improves, the distinction between public spaces and private spaces becomes
less meaningful. There is a vast difference between a law enforcement officer’s sifting through thousands of hours of
video footage in search of a person of interest, and his using software to instantly locate that person anywhere, at any
time. A person in public may have no reasonable expectation of privacy at any given moment, but he certainly has a reasonable
expectation that the totality of his movements will not be effortlessly tracked and analyzed by law enforcement without
probable cause. Such tracking, as the federal appellate judge Douglas H. Ginsburg once ruled, impermissibly “reveals an intimate picture of the
subject’s life that he expects no one to have — short perhaps of his wife.” Before the advent of these new technologies, time and effort created
effective barriers to surveillance abuse. But those barriers are now being removed. They must be rebuilt in the law. Two policies are necessary. First,
facial-recognition databases should be populated only with images of known terrorists and convicted felons. Driver’s
license photos and other images of “ordinary” people should never be included in a facial-recognition database
without the knowledge and consent of the public. Second, access to databases should be limited and monitored. Officers
should be given access only after a court grants a warrant. The access should be tracked and audited. The authorities
should have to publicly report what databases are being mined and provide aggregate numbers on how often they are
used. We cannot leave it to law enforcement agencies to determine, behind closed doors, how these databases are used. With the right safeguards,
facial-recognition technology can be employed effectively without sacrificing essential liberties.
D. Legislation is necessary—we cannot rely on the courts to navigate changes in
information technology
Stephen Rushin, Visiting Assistant Professor, Law, University of Illinois, “The Legislative Response to Mass Police Surveillance,”
BROOKLYN LAW REVIEW v. 79, Fall 2013, p. 41-42.
Any future judicial response must be coupled with state legislation. Even if the judiciary eventually accepts some version
of the mosaic theory in interpreting the Fourth Amendment, we should not expect the Court to hand down detailed
regulations for the use of these technologies. Justice Alito's concurrence in Jones is telling. His proposal to regulate the efficiency of
surveillance technologies would only control data retention. n261 And the amount of data that a police department could reasonably retain
without a warrant would vary from one situation to the next based upon the relative seriousness of the possible crime at issue. n262 This
barely scratches the surface of broader problems posed by the digitally efficient state. Under what conditions should we
permit extensive data retention? When should we limit this kind of retention? Is data aggregation more acceptable as long as the data is not
cross-referenced with other databases, thereby personally identifying individuals? Should we regulate law enforcement's access to this personal
data? And where should this data be stored? Even my original proposal for judicial regulation of mass police surveillance only addressed a
handful of these questions. I recommended that courts require police to develop clear data retention policies that are tailored to only retain data
as long as necessary to serve a legitimate law enforcement [*42] purpose. n263 Like Alito's proposal, such a standard would vary according to
the seriousness of the crime under investigation and the individual circumstance. I also argued that in cases where police retain surveillance
data without a warrant through electronic means, they should have a legitimate law enforcement purpose before cross-referencing that data with
other databases for the purposes of identifying individuals. n264 Both the Jones concurrence and my previous proposal would establish a broad
judicial principle mandating that police regulate data retention according to the seriousness of the crime under investigation and the legitimate
need for such retention. This type of judicial response is limited in nature. Legislative bodies would likely need to step in to
provide more detailed standards. The legislative branch has several advantages over the judiciary that make it
appropriate for this type of detailed policy building. The legislature has a wider range of enforcement mechanisms than
the judiciary. The legislature can mandate in-depth and regular oversight. And it has the resources and tools to
develop extensive, complex regulations. As a result, the legislature is the best-positioned branch to address some of
the critical issues raised by the digitally efficient investigative state, such as data storage, access, and sharing
policies.
Solvency: 1AC (3/3)
E. Congress needs to establish guidelines restricting the use of FRT
Bridget Mallon, “‘Every Breath You Take, Every Move You Make, I’ll Be Watching You’: The Use of Facial Recognition Technology,”
VILLANOVA LAW REVIEW v. 48, 2003, p. 985-987.
One of the most significant problems with the already existing face recognition systems is the lack of laws or regulations setting guidelines for their uses. n166 As discussed previously, while its use may be protected by the Constitution, there still
remains a need to regulate biometric technology. n167 Many people believe that the use of the technology may infringe on
individuals' privacy rights. n168 [*986] Another issue concerning the use of face recognition technology is what happens with all of the
information that is gathered. n169 Although the government currently maintains that it automatically discards all faces that are not a match,
there is growing concern that, with the expanding technology, the government will begin maintaining files on all of the faces scanned into the
databases. n170 The same fears of information gathering are even more prevalent when face recognition technology makes its way into the
hands of private citizens. n171 Many have expressed concern that in the hands of private citizens, face recognition technology will allow the
general public to maintain vast databases of information on individuals, retrievable the moment a face is scanned and a match is made. n172
This technology has the potential to allow third parties to monitor constantly the movements of individuals, thereby affording them no privacy.
n173 This concern over the lack of regulation led to the Congressional Privacy Caucus, formed in an effort to discuss and investigate current
privacy issues, with a focus on maintaining personal privacy. n174 Even individuals in the industry have raised this concern over a
lack of individual privacy. At least one maker of face recognition technology has called for the regulation of its use,
focusing on notifying individuals that they are being monitored. n175 [*987] Congress must establish a set of guidelines
that regulate the use of this technology by both the government and private citizens. n176 As the makers of the technology have
recognized the need for regulation, there is no doubt that the legislature is not far behind. n177 Furthermore, although the use of face
recognition technology remains untested in the court system, its expansion virtually assures that it will not remain untested for long. n178
Status Quo Ext: No Court Limits Now
Fourth Amendment rights don’t protect against mass surveillance like biometrics.
Dr. Marc Jonathan Blitz, attorney, “Video Surveillance and the Constitution of Public Space: Fitting the Fourth Amendment to a World that
Tracks Image and Identity,” TEXAS LAW REVIEW v. 82, 5—04.
But while such constitutional limits on wide-scale video surveillance may seem intuitively reasonable and necessary, contemporary Fourth
Amendment jurisprudence is ill-equipped to provide or even delineate them for at least two reasons. The first is that mass
video surveillance occurs in the public realm—in streets, parks, and highways—where courts have been reluctant to find
that individuals have reasonable expectations of privacy, at least in that information which they fail to conceal.37 Unlike random
stops and searches by government officials, extensive video surveillance does not dig beneath the visible surface that people project to the
world. As a consequence, contemporary Fourth Amendment jurisprudence differentiates pervasive video surveillance from more familiar
mass suspicionless searches in one crucial respect: by holding that it is not a "search" at all.38 Fourth Amendment "searches,"
according to the Supreme Court's current test, do not include all investigations of the sort an English speaker might describe as a "search."39
As the Supreme Court emphasized in its landmark decision in Katz v. United States, which still provides the key legal test for what counts as a
"search," "what a person knowingly exposes to the public ... is not a subject of Fourth Amendment protection."40 Thus, even when police
carefully scan a crowd with binoculars, in search of a particular person, they are not engaging in a Fourth Amendment "search."41 Fourth
Amendment interests are implicated only when the government uncovers things that people conceal. Because the Fourth Amendment offers
protection only against suspicionless searches and seizures—and not against suspicionless examinations (no matter how rigorous)—public
camera networks would seem to be outside of the Fourth Amendment's ambit, at least as long as their focus remains on public space and does
not wander into private homes, offices, or other enclosed areas.
Current Fourth Amendment jurisprudence imposes no limits on biometric databases
Margaret Hu, Visiting Assistant Professor, Law, Duke University, “Biometric ID Cybersurveillance,” INDIANA LAW JOURNAL v. 88, Fall 2013,
p. 1481-1482.
Yet, the existing Fourth Amendment jurisprudence is tethered to a "reasonable expectation of privacy" test that does
not appear to restrain the comprehensive, suspicionless amassing of databases that concern the biometric data,
movements, activities, and other personally identifiable information of individuals. At the same time, most scholars agree that
the Fourth Amendment should protect ordinary citizens from mass, suspicionless surveillance and cyber-surveillance "fishing expeditions" by the
government. Any attempt to grapple with the consequences of modern cybersurveillance, therefore, should attempt to delineate how
surveillance is administratively and technologically implemented through increasingly normalized mechanisms of identity tracking. Consequently,
it is necessary to consider what role, if any, the Fourth Amendment will play in restraining a rapidly evolving bureaucratized cyber-surveillance
movement that now constitutes what some scholars have described as the post-9/11 "national surveillance state."
Status Quo Ext: No Regulations Now
Congress has failed to act—a serious cause for concern
Laura K. Donohue, Associate Professor, Law, Georgetown University, “Technological Leap, Statutory Gap, and Constitutional Abyss:
Remote Biometric Identification Comes of Age,” MINNESOTA LAW REVIEW v. 97, 12—12, p. 414-416.
Despite the explosion of federal initiatives in this area, Congress has been virtually silent on the many current and
potential uses of FRT and related technologies. No laws directly address facial recognition—much less the pairing of
facial recognition with video surveillance—in either the criminal law or foreign intelligence realm. Many of the existing
limits placed on the collection of personally identifiable information do not apply. Only a handful of hearings has even
questioned the use of biometrics in the national security or law enforcement context. The absence of a statutory
framework is a cause for significant concern. Facial recognition represents the first of a series of next generation
biometrics, such as hand geometry, iris, vascular patterns, hormones, and gait, which, when paired with surveillance of public space, give rise
to unique and novel questions of law and policy.1 These constitute what can be considered Remote Biometric Identification (RBI). That is, they
give the government the ability to ascertain the identity (1) of multiple people; (2) at a distance; (3) in public space; (4) absent notice and
consent; and (5) in a continuous and on-going manner. As such, RBI technologies present capabilities significantly different from that which the
government has held at any point in U.S. history. Hitherto, identification techniques centered on what might be called Immediate Biometric
Identification (IBI) - or the use of biometrics to determine identity at the point of arrest, following conviction, or in conjunction with access to
secure facilities. Fingerprint is the most obvious example of IBI, although more recent forays into palm prints fall with-in this class. DNA
technologies that require individuals to provide saliva, skin, or other samples for analysis also can be considered as part of IBI. Use of
technology for IBI, in contrast to RBI, tends to be focused (1) on a single individual; (2) close-up; (3) in relation either to custodial detention or in
the context of a specific physical area related to government activity; (4) in a manner often involving notice and often consent; and (5) is a one-time or limited occurrence. The types of legal and policy questions raised by RBI differ from those accompanying IBI. What we are witnessing,
as a result of the expansion from IBI to RBI, is a sea change in how we think about individuals in public space. Congress has yet to grapple with
the consequences.
Biometrics capabilities are far outstripping our legal framework
Tana Ganeva, managing editor, “5 Things You Should Know About the FBI's Massive New Biometric Database,” ALTERNET, 1—8—12,
www.alternet.org/story/153664/5_things_you_should_know_about_the_fbi%27s_massive_new_biometric_database.
An agency powerpoint presented at a 2011 biometrics conference outlines some of the sophisticated technology in the FBI's face recognition
initiatives. There's software that distinguishes between twins; 3-D face capture that expands way beyond frontal, two-dimensional mugshots;
face-aging software; and automated face detection in video. The report also says agencies can ID individuals in "public datasets,"
which privacy advocates worry could potentially include social media like Facebook. Meanwhile, existing laws are
rarely up to speed with galloping technological advances in surveillance, say privacy advocates. At this point, "You
just have to rely on law enforcement to do the right thing," Lynch says.
There are no regulations governing law enforcement use of FRT
Al Franken, U.S. Senator, Statement before the Senate Judiciary Committee, Subcommittee on Privacy, Technology and the Law, 7—18—
12, www.judiciary.senate.gov/imo/media/doc/12-7-8FrankenStatement.pdf.
I called this hearing to raise awareness about the fact that facial recognition already exists right here, today, and we need to think
about what that means for our society. I also called this hearing to call attention to the fact that our federal privacy laws
are almost totally unprepared to deal with this technology. Unlike what we have in place for wiretaps and other
surveillance devices, there is no law regulating law enforcement use of facial recognition technology. And current
Fourth Amendment case law generally says that we have no reasonable expectation of privacy in what we voluntarily
expose to the public — yet we can hardly leave our houses in the morning without exposing our faces to the public. So law enforcement
doesn’t need a warrant to use this technology on someone. It might not even need to have a reasonable suspicion that the subject has been
involved in a crime.
Harms Ext: First Amendment Rights
Biometric surveillance technologies will be targeted at political dissidents—will be used
to stifle First Amendment rights
Al Franken, U.S. Senator, Statement before the Senate Judiciary Committee, Subcommittee on Privacy, Technology and the Law, 7—18—
12, www.judiciary.senate.gov/imo/media/doc/12-7-8FrankenStatement.pdf.
Now many of you may be thinking that that’s an excellent thing. I agree. But unless law enforcement facial recognition programs are
deployed in a very careful manner, I fear that these gains could eventually come at a high cost to our civil liberties. I fear
that the FBI pilot could be abused to not only identify protesters at political events and rallies, but to target them for
selective jailing and prosecution, stifling their First Amendment rights. Curiously enough, a lot of the presentations on
this technology by the Department of Justice show it being used on people attending political events or other public
gatherings. I also fear that without further protections, facial recognition technology could be used on unsuspecting
civilians innocent of any crime — invading their privacy and exposing them to potential false identifications. Since 2010,
the National Institute of Justice, which is a part of DOJ, has spent $1.4 million to develop facial recognition-enhanced binoculars that can be
used to identify people at a distance and in crowds. It seems easy to envision facial recognition technology being used on innocent civilians
when all an officer has to do is look at them through his binoculars.
Harms Ext: NGI
NGI contains too much data
Jennifer Lynch, “FBI Plans to Have 52 Million Photos in Its NGI Face Recognition Database by Next Year,” ELECTRONIC FRONTIER
FOUNDATION, 4—14—14, https://www.eff.org/deeplinks/2014/04/fbi-plans-have-52-million-photos-its-ngi-face-recognition-database-next-year.
NGI builds on the FBI’s legacy fingerprint database—which already contains well over 100 million individual records—and has
been designed to include multiple forms of biometric data, including palm prints and iris scans in addition to
fingerprints and face recognition data. NGI combines all these forms of data in each individual’s file, linking them to
personal and biographic data like name, home address, ID number, immigration status, age, race, etc. This immense
database is shared with other federal agencies and with the approximately 18,000 tribal, state and local law
enforcement agencies across the United States. The records we received show that the face recognition component of NGI
may include as many as 52 million face images by 2015. By 2012, NGI already contained 13.6 million images representing
between 7 and 8 million individuals, and by the middle of 2013, the size of the database increased to 16 million images. The new
records reveal that the database will be capable of processing 55,000 direct photo enrollments daily and of conducting
tens of thousands of searches every day.
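As a rough consistency check on these figures, a sketch follows. All numbers come from the evidence; the assumption that enrollment runs at full daily capacity is mine, for illustration only.

```python
# Figures from the card.
daily_capacity = 55_000        # direct photo enrollments per day
images_mid_2013 = 16_000_000   # database size, mid-2013
projected_2015 = 52_000_000    # FBI projection for 2015

# Hypothetical: assume the system enrolls at full capacity every day.
max_annual = daily_capacity * 365
print(f"Maximum enrollments per year: {max_annual:,}")  # 20,075,000

years_needed = (projected_2015 - images_mid_2013) / max_annual
print(f"Years of full-capacity enrollment to reach the projection: {years_needed:.1f}")
# ~1.8 years from mid-2013, i.e., roughly the FBI's 2015 target date
```

The projection is reachable only if enrollment runs near the stated daily capacity, which underscores the throughput the system was built for.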
NGI leads to innocent people being suspected of crimes they have no relation to.
Jennifer Lynch, “FBI Plans to Have 52 Million Photos in Its NGI Face Recognition Database by Next Year,” ELECTRONIC FRONTIER
FOUNDATION, 4—14—14, https://www.eff.org/deeplinks/2014/04/fbi-plans-have-52-million-photos-its-ngi-face-recognition-database-next-year.
There are several reasons to be concerned about this massive expansion of governmental face recognition data
collection. First, as noted above, NGI will allow law enforcement at all levels to search non-criminal and criminal face
records at the same time. This means you could become a suspect in a criminal case merely because you applied for a
job that required you to submit a photo with your background check . Second, the FBI and Congress have thus far failed
to enact meaningful restrictions on what types of data can be submitted to the system, who can access the data, and
how the data can be used. For example, although the FBI has said in these documents that it will not allow non-mug shot
photos such as images from social networking sites to be saved to the system, there are no legal or even written FBI policy
restrictions in place to prevent this from occurring. As we have stated before, the
Privacy Impact Assessment for NGI’s face
recognition component hasn’t been updated since 2008, well before the current database was even in development. It
cannot therefore address all the privacy issues impacted by NGI. Finally, even though FBI claims that its ranked candidate list
prevents the problem of false positives (someone being falsely identified), this is not the case. A system that only purports to
provide the true candidate in the top 50 candidates 85 percent of the time will return a lot of images of the wrong people. We
know from researchers that the risk of false positives increases as the size of the dataset increases—and, at 52 million images, the
FBI’s face recognition is a very large dataset. This means that many people will be presented as suspects for crimes they
didn’t commit. This is not how our system of justice was designed and should not be a system that Americans tacitly
consent to move towards.
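The card's false-positive arithmetic can be sketched briefly. The 50-candidate list and the 85 percent hit rate come from the evidence; the daily search volume is a hypothetical stand-in for the "tens of thousands of searches every day" described in the records above.

```python
# Figures from the card.
list_size = 50       # candidates returned per search
hit_rate = 0.85      # true match appears somewhere in the list 85% of the time

# Hypothetical volume, within the card's "tens of thousands" of daily searches.
searches_per_day = 30_000

# Even a "successful" search surfaces 49 wrong people; a failed one surfaces 50.
wrong_per_search = hit_rate * (list_size - 1) + (1 - hit_rate) * list_size
print(f"Expected wrong candidates per search: {wrong_per_search:.2f}")  # 49.15

print(f"Wrong-person results per day: {wrong_per_search * searches_per_day:,.0f}")
# ~1.5 million images per day of people unconnected to the crime under investigation
```

The structural point stands regardless of the exact volume: a ranked candidate list guarantees that nearly every search returns dozens of people with no connection to the case.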
Harms Ext: Privacy Rights (1/2)
Biometric technology is inherently individuating and interfaces easily with database technology, making privacy violations easier and more damaging
William Abernathy and Lee Tien, staff, BIOMETRICS: WHO’S WATCHING YOU? Electronic Frontier Foundation, 9—03,
https://www.eff.org/wp/biometrics-whos-watching-you.
Biometrics refers to the automatic identification or identity verification of living persons using their enduring physical or
behavioral characteristics. Biometric technology is inherently individuating and interfaces easily to database
technology, making privacy violations easier and more damaging. If we are to deploy such systems, privacy must be
designed into them from the beginning, as it is hard to retrofit complex systems for privacy. Documents that the FBI turned
over only after a Freedom of Information Act (FOIA) lawsuit filed by the Center for Constitutional Rights, the National Day Labor Organizing
Network, and the Benjamin Cardozo Immigrant Justice Clinic reveal that the FBI views massive biometric information collection as a goal in
itself. The FBI’s Criminal Justice Information Services division (CJIS) has already conducted a test study with latent fingerprints and palm prints,
collected more than one million palm prints, scheduled an iris scan pilot program, and plans future deployment of technology nationwide to
collect other biometric data like scars, marks, tattoos, and facial measurements. What’s more, the government continues to expand domestic use of FBI Mobile scanners initially used in Iraq and Afghanistan. The newly-released documents also reveal that not only do the
Department of Homeland Security (DHS) and the Department of Justice already have access to this personal information, but so do the
Department of Defense, the U.S. Coast Guard, foreign governments, and potentially the Nuclear Regulatory Commission. Indeed, CJIS has
information-sharing relationships with more than 75 countries. This ubiquitous world-wide surveillance of anyone and everyone should serve as
a wake up call for what the future may hold. Rapid deployment of the new technologies uncovered in the FOIA records brings us closer to an
extensive and inescapable surveillance state, where we blindly place our hands on electronic devices that capture our digital prints, stare into iris
scanning devices that record the details of our eyes, and have pictures taken of different angles of our faces so that the FBI and other federal
agencies can store and use such information. Some state and local officials have heavily resisted sharing the fingerprints of non-citizens
with immigration authorities, as it can cause community members to fear reporting crime, break up families through unjust detentions and
deportations, and lead to law enforcement abuse of authority or racial profiling. Yet the FBI and DHS have prohibited states and localities from
placing limits on the FBI’s use of the data. Recently, high-ranking Immigration and Customs Enforcement official Gary Mead was asked by local
advocates at a debate in New York why the agency insisted on overriding the governors’ request to prevent federal sharing of immigrant prints.
He responded that allowing the FBI to share data with other agencies is the “price of admission” for joining “the FBI club.” But none of us—those
paying the price of having our personal data collected, analyzed, and shared—want the FBI club to indiscriminately share our personal
information. Expanding NGI raises numerous concerns about government invasion of privacy (because of the access, retention, use, and
sharing of biometric information without individual consent or knowledge), the widening of federal government surveillance (the NGI database
will hold information that can be used to track individuals long into the future), and the increased risk of errors and vulnerability to hackers and
identity thieves. Federal agencies don’t like to admit that they make mistakes, but we know it happens. Take for
example Mark Lyttle, a United States citizen who was mistakenly deported and sent to five different countries in four
months after a criminal corrections administrator erroneously typed “Mexico” as his place of birth. Or U.S.-born Brandon
Mayfield, who was wrongly accused of perpetrating the 2004 Madrid train bombing and was held in police custody for two weeks based on an
alleged match between his fingerprints and latent prints from the crime scene, a match that was later deemed inaccurate. These types of
mistakes are even more likely to occur as the FBI relies upon new, questionable physical-trait scanning technologies.
One recent study found that when used among large populations, facial recognition will inevitably lead to
misidentifications because of the lack of variation across individuals’ faces. Indeed, John Gass of Massachusetts was the victim
of facial recognition technology errors when his driver’s license was revoked based on a facial recognition system that determined his authentic
and legitimate license was fake. These disturbing examples will become only more frequent and have more serious
consequences as the database grows and more federal agencies and foreign governments join the “FBI club.” The
rapid and massive expansion of NGI’s collection, storage, and sharing capabilities is moving us closer and closer to
the type of pervasive surveillance referenced by Justice Breyer. We can only wonder, is Orwell’s Thought Police
next? An annotated index of newly released FOIA documents related to NGI and the FBI’s role in Secure Communities,
along with the documents, is available at: http://uncoverthetruth.org/?p=2058. To read previously released documents related to NGI and a related fact sheet, go to www.uncoverthetruth.org/foia-ngi/ngi-documents. To learn more about Secure
Communities and how you can prevent its implementation in your community or state, visit www.uncoverthetruth.org. For
more information about the case NDLON v. ICE brought by CCR, the National Day Laborer Organizing Network and the
Benjamin Cardozo Immigrant Justice Clinic, visit CCR’s case page.
Harms Ext: Privacy Rights (2/2)
Biometrics creates a constant panoptic surveillance structure that limits privacy and
freedom
Christopher Slobogin, Professor, Law, University of Florida, “Public Privacy: Camera Surveillance of Public Places and the Right to
Anonymity,” MISSISSIPPI LAW JOURNAL v. 72, 2002, http://www.olemiss.edu/depts/ncjrl/pdf/LJournal02Slobog.pdf
In addition to its effect on behavior, CCTV might trigger a number of unsettling emotional consequences. Relying on the work of Erving Goffman,
Jeffrey Rosen notes that “it's considered rude to stare at strangers whom you encounter in public.”141 Staring, whether it occurs on an elevator,
on public transportation or on the street, violates the rules of “civil inattention.”142 The cyclopean gaze of the camera eye may be equally
disquieting, and perhaps more so given the anonymity of the viewer and the unavailability of normal countermeasures, such as staring back or
requesting the starer to stop. The small amount of social science research specifically aimed at assessing the impact of
concerted surveillance tends to verify that these and other psychological and behavioral effects can occur. For
instance, empirical investigations of the workplace–one of the contexts Foucault thought might benefit from
panopticism–indicate that even there surveillance has a downside. Monitored employees are likely to feel less trusted,
less motivated, less loyal, and more stressed than employees who are not subject to surveillance.143 Whether these findings would be duplicated in the public surveillance context is not clear.144 But one could plausibly infer
from them that citizens on the street who are subject to camera surveillance might experience less confidence in their
overall freedom to act, as well as somewhat diminished loyalty to a government that must watch its citizens' every public movement. Roger
Clarke also calls attention to the latter possibility in his study of the effects of widespread surveillance. Among the many consequences of
“dataveillance,” as he calls it, are a prevailing climate of suspicion, an increase in adversarial relationships between citizens and government,
and an increased tendency to opt out of the official level of society.145 To capture the core of these disparate observations, consider again
Rehnquist's bar example. The people entering the bar will feel less trusted and more anxious, and may even stop going there. Or try another
simple thought experiment. Virtually all of us, no matter how innocent, feel somewhat unnerved when a police car pulls
up behind us. Imagine now being watched by an officer, at a discreet distance and without any other intrusion, every
time you walk through certain streets. Say you want to run (to catch a bus, for a brief bit of exercise or just for the hell of it). Will you? Or
assume you want to obscure your face (because of the wind or a desire to avoid being seen by an officious acquaintance)? How about hanging
out on the street corner (waiting for friends or because you have nothing else to do)? In all of these scenarios, you will probably feel and perhaps
act differently than when the officer is not there. Perhaps your hesitancy comes from uncertainty as to the officer's likely reaction or simply from
a desire to appear completely law-abiding; the important point is that it exists. Government-run cameras are a less tangible presence than the
ubiquitous cop, but better at recording your actions. A police officer in Liverpool, England may have said it best: A camera is like having a cop “on duty 24 hours a day, constantly taking notes.”146
Harms Ext: Race / Sorting
Facial recognition technology raises enormous equity concerns
Lucas D. Introna, Lancaster University and Helen Nissenbaum, New York University, FACIAL RECOGNITION TECHNOLOGY: A SURVEY
OF POLICY AND IMPLEMENTATION ISSUES, Center for Catastrophe Preparedness and Response, New York University, 2009, p. 45.
The question of fairness is whether the risks of FRS are borne disproportionately by, or the benefits flow
disproportionately to, any individual subjects, or groups of subjects. For example, in the evaluations discussed above,
noting that certain systems achieve systematically higher recognition rates for certain groups over others—older
people over youth and Asians, African-Americans, and other racial minorities over whites—raises the politically
charged suggestion that such systems do not belong in societies with aspirations of egalitarianism. If, as a result of
performance biases, historically affected racial groups are subjected to disproportionate scrutiny, particularly if
thresholds are set so as to generate high rates of false positives, we are confronted with racial bias similar to
problematic practices such as racial profiling. Beyond thorny political questions raised by the unfair distribution of false positives, there
is the philosophically intriguing question of a system that manages disproportionately to apprehend (and punish) guilty parties from one race,
ethnicity, gender, or age bracket over others. This question deserves more attention than we are able to offer here but is worth marking for future
discussion.
Biometric data targets minorities
Margaret Hu, “Biometric ID Cybersurveillance,” INDIANA LAW JOURNAL v. 88, Fall 2013, p. 1475, Lexis Nexis.
In biometric-verification technology, accuracy improves if all other factors remain stable in the environment. For example, NIST has learned
through PIV card/biometric ID card implementation that the same vendor should be used to ensure higher accuracy. Biometric technology users
are instructed to attempt to ensure that the environment for the biometric data enrollment and the verification are identical (e.g., attempt to use
same staff, same room, same lighting, and same humidity levels). In addition, experts have realized that biometric verification systems need to
develop an alternative system for people with no fingerprints, those with "damaged" fingerprints, dysplasia resulting in no lines in fingerprints,
and so forth. Biometric research has determined that biometric data is less accurate and harder to recognize for women
(fine skin and less defined fingerprints due to housecleaning solution and face cleansing) and the elderly (loss of
collagen). [*1541]
Biometric research has also determined that the statistical algorithms have a racially disparate
impact in accuracy for reasons that are not fully understood. Finally, biometric technology has not yet adopted a uniform standard
domestically or internationally. Some advocate adoption of the INTERPOL fingerprinting standard, which is similar to the American standard
(e.g., using image and points within fingerprint). This matter, however, remains unresolved.
Surveillance tech risks being used to engage in discriminatory targeting
Christopher S. Milligan, J.D. Candidate, “Facial Recognition Technology, Video Surveillance, and Privacy,” SOUTHERN CALIFORNIA
INTERDISCIPLINARY LAW JOURNAL v. 9, Winter 1999, p. 328.
""Once the new surveillance systems become institutionalized and taken for granted in a democratic society,' they can
be "used against those with the "wrong' political beliefs; against racial, ethnic, or religious minorities; and against
those with lifestyles that offend the majority.'" For example, in Miami Beach, the use of public video surveillance for law enforcement
purposes was started only after lower-income black and Hispanic citizens began residing in the com-munity of elderly retirees. Social
psychologists have also determined that videotaping participants in political activities affects their sense of self,
because there is a tendency to unconsciously associate being surveilled with criminality. Individuals may be less
willing to become involved in political activities if they know that they will be subject to video surveillance, and
possibly arrest. In this manner, the use of video surveillance can be used to obstruct the political speech and
associational rights of individuals and groups that engage in political activities.
The information is at risk of being abused—data creep
Christopher S. Milligan, J.D. Candidate, “Facial Recognition Technology, Video Surveillance, and Privacy,” SOUTHERN CALIFORNIA
INTERDISCIPLINARY LAW JOURNAL v. 9, Winter 1999, p. 329.
The term "data creep" describes the way in which information given and used for one purpose, such as public records,
tends to be disseminated - from the governmental agency that collects the information to a company that compiles
databases of information, and then finally to the consumers of this information. There are fears that information
collected through video surveillance and facial recognition technology (combined with other potential sources of data)
will be shared among governmental agencies and outside entities. There are also concerns about the inappropriate use
of video surveillance and facial recognition software by those who are doing the monitoring . There have been numerous
incidents of police observers being disciplined for directing the gaze of video cameras towards inappropriate areas of civilians' anatomy or
through bedroom windows.
Solvency Ext: Congress
FRT and other biometric technologies are not simply an extension of existing
surveillance—they are radically different and more troubling—Congress needs to act
Laura K. Donohue, Associate Professor, Law, Georgetown University, “Technological Leap, Statutory Gap, and Constitutional Abyss:
Remote Biometric Identification Comes of Age,” MINNESOTA LAW REVIEW v. 97, 12—12, p. 558.
One of Cohen's most important insights, and one shared by other constructive theorists, is that privacy plays a more central role in
human experience than liberal political theory acknowledges. Boundary management, which gives breathing space for
subjectivity - and critical subjectivity in particular - depends upon privacy not as an individual right, but as a social
good. Cohen notes, "A society that wishes to foster critical subjectivity must cabin the informational and spatial logics of
surveillance." Other norms, such as mobility, access to knowledge, and discontinuity, may prove equally important in
development of the self and society. There is a broader danger in reducing the self to binary code. At a minimum, much
more work on this front, specifically in regard to how we think about privacy as a constitutive principle in regard to
information recording, access and management - and particularly as it relates to new identification technologies - needs to occur. This approach rests not on simply adapting the existing frameworks, but on re-conceiving the place of
privacy for self and society. What makes this inquiry so pressing is that the federal government, to date, has been so
eager to take advantage of the new technologies that constitute RBI. At one level, this makes a great deal of sense. To
the extent that technology makes officials more efficient, utilizes resources more effectively, and helps to accomplish
the aims of government agencies, strong support would naturally follow. This is a rationale adopted by all three branches of
government, as illustrated by, e.g., legislative directives to the executive branch to move swiftly to explore biometric technologies (Part I, above),
initiatives taken by the Executive branch post-9/11 to develop new systems (Part I, above), and judicial decisions that rest upon the assumption
that the new technologies merely do what a single police officer could do by tailing an individual - but more efficiently (Part III, above). The
problem with this approach is that the underlying assumption is wrong. This technology is not simply more efficient. It
is different in kind - not degree - to what has come before. These are not just incremental changes to the status quo,
which ought to be treated in a manner consistent with traditional framings. Cameras in public space capture
significantly more than the naked eye outside the curtilage of the home might learn. They record, with perfect recall,
entire contexts, which may, in conjunction with other biometric and biographic data, reveal new insights into citizens'
lives and social networks. The kind of surveillance in question, the length of the surveillance, and the radical reduction
in resource limitations all differ. It is time for Congress - and the courts - to recognize this new form of surveillance.
Towards these ends, I have proposed five guidelines to distinguish RBI technologies from those more common in
immediate biometric identification. Specifically, RBI allows the government to ascertain the identity (1) of multiple
people; (2) at a distance; (3) in public space; (4) absent notice and consent; and (5) in a continuous and on-going
manner. The stakes could not be higher for subjecting technologies that fall into this category to more rigorous
scrutiny. For what we now face are questions about human development, social construction, and the role of government in the lives of
citizens - questions that go well beyond individual rights.
DA Ans: Terrorism
Biometric surveillance is ineffective.
Dustin Volz, journalist, “FBI's Facial Recognition Software Could Fail 20 Percent of the Time,” NATIONAL JOURNAL, 10—14—13,
www.nationaljournal.com/daily/fbi-s-facial-recognition-software-could-fail-20-percent-of-the-time-20131014
The Federal Bureau of Investigation's facial-recognition technology, used to identify individuals and assist in investigations,
could fail one in every five times it is used, new documents show. A 2010 report recently made public by the Electronic Privacy
Information Center through a Freedom of Information Act request states that the facial-recognition technology "shall return an
incorrect candidate a maximum of 20% of the time." When the technology is used against a searchable repository, it "shall return the correct
candidate a minimum of 85% of the time." "An innocent person may become part of an investigation because the technology
isn't completely accurate," said Jeramie Scott, an attorney with EPIC who reviewed the documents, citing the Boston Marathon bombings as an
example.
Other methods of surveillance trump biometrics
Dustin Volz, journalist, “FBI's Facial Recognition Software Could Fail 20 Percent of the Time,” NATIONAL JOURNAL, 10—14—13,
www.nationaljournal.com/daily/fbi-s-facial-recognition-software-could-fail-20-percent-of-the-time-20131014
[The 2010 report] states that the facial-recognition technology "shall return an incorrect candidate a maximum of 20% of the time." When
the technology is used against a searchable repository, it "shall return the correct candidate a minimum of 85% of the time." "An innocent person
may become part of an investigation because the technology isn't completely accurate ," said Jeramie Scott, an attorney with EPIC
who reviewed the documents, citing the Boston Marathon bombings as an example. "They're pushing it forward even though the technology isn't ready for
prime time." FBI officials could not be reached for comment, perhaps owing to the government shutdown. The numbers posted by facialrecognition software compare unfavorably with other identification techniques used as part of the bureau's Next Generation
Identification program, including fingerprinting, which yields an accurate match 99 percent of the time when combing a
database, and iris scans, which field correct matches 98 percent of the time. Currently, no federal laws limit the use of facialrecognition software by private-sector companies
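The percentage gap between these techniques is easier to see as counts. The sketch below is a minimal illustration, assuming the quoted rates hold; the batch of 1,000 searches is our assumption for scale, not a figure from the source.

    # Minimal sketch comparing the miss rates implied by the accuracy figures
    # quoted above. The 1,000-search batch is an illustrative assumption.

    rates = {
        "face recognition (correct candidate in top 50)": 0.85,
        "iris scan": 0.98,
        "fingerprint": 0.99,
    }

    SEARCHES = 1_000  # assumed batch size for illustration

    for method, hit_rate in rates.items():
        misses = SEARCHES * (1 - hit_rate)
        print(f"{method}: ~{misses:.0f} misses per {SEARCHES:,} searches")

    # Under these assumptions: face recognition misses ~150 of every 1,000
    # searches, versus ~20 for iris scans and ~10 for fingerprints.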
Biometrics cannot effectively exclude dangerous people—too many technical barriers
Torin Monahan, Assistant Professor, Justice and Social Inquiry, Arizona State University, “Questioning Surveillance and Security,”
SURVEILLANCE AND SECURITY: TECHNOLOGICAL POLITICS AND POWER IN EVERYDAY LIFE, ed. T. Monahan, 2006, New York:
Routledge, p. 7.
But do biometrics work for the purpose of locating and stopping terrorists? According to the U.S. General Accounting
Office, although “the desired benefit is the prevention of the entry of travelers who are inadmissible to the United States”
(Kingsbury 2003: 6), or “keeping the bad guys out” in President George W. Bush’s parlance, the challenges to the success of
biometric systems are manifold. Obstacles include labor increases, travel delays, tourism reduction, inadequate
training, grandfathering arrangements, reciprocal requirements from other countries, exemptions, false IDs,
“significant” costs, and circumvention of border systems by more than 350,000 illegal entries a year (U.S. Citizenship and
Immigration Services 2002). In addition, more technical obstacles include managing a massive database of up to 240 million
records and maintaining accurate “watch lists” for suspected terrorists.
Biometric systems cannot stop terrorism—may even lull us into complacency
Torin Monahan, Assistant Professor, Justice and Social Inquiry, Arizona State University, “Questioning Surveillance and Security,”
SURVEILLANCE AND SECURITY: TECHNOLOGICAL POLITICS AND POWER IN EVERYDAY LIFE, ed. T. Monahan, 2006, New York:
Routledge, p. 8.
A recent report by Privacy International is forceful in its denunciation of biometrics and national identity cards. The
report argues that because no evidence exists that these systems can or do prevent terrorism, any link between these
systems and antiterrorism is merely rhetorical: Of the 25 countries that have been most adversely affected by terrorism since 1986, eighty per cent
have national identity cards, one third of which incorporate biometrics. This research was unable to uncover any instance where the presence of an identity card
system in those countries was seen as a significant deterrent to terrorist activity. Almost two thirds of known terrorists operate under their true identity … It is possible
that the existence of a high integrity identity card would provide a measure of improved legitimacy for these people. (Privacy International 2004a: 2) Thus, not only
might biometric systems fail to perform their intended functions, they might have the opposite effect of deflecting inquiry away from terrorists who possess valid high-tech biometric IDs. This point should give policy makers pause, because all of the 9/11 attackers entered the United States legally with the requisite visas (Seghetti
2002). Finally, even with completely operational biometric and national ID systems in place, there are numerous ways to circumvent them, for instance, by pretending
to be an “outlier” (or a person unable to provide accurate biometric data), acquiring a false identity, escaping watch lists (by providing false information or by virtue of
being a “new recruit”), or spoofing identity (for instance, by using custom-made contact lenses to fool iris scanners) (Privacy International 2004a: 7–8).
Regardless of the cost or complexity of implementing and harmonizing biometric systems across countries, it is clear
that they can never be foolproof, and it is questionable whether they would even diminish threats (see Van der Ploeg [Chapter
11, this volume] for a detailed inquiry into the social effects of some of these systems along borders).
Neg: Terror DA Link—1NC
FRT is necessary to protect our infrastructure against terrorist attacks
Roberto Iraola, Senior Advisor to the Deputy Assistant Secretary for Law Enforcement and Security, Department of the Interior, “Lights, Camera, Action!—Surveillance Cameras, Facial Recognition Systems and the Constitution,” LOYOLA LAW REVIEW v. 49, Winter 2003, p.
806-807.
The events of September 11 demonstrate that terrorists have no reservation about engaging in direct attacks on America's mainland. As recognized by President Bush in his National Strategy for the Physical Protection of Critical Infrastructures and Key Assets: Protecting America's critical infrastructures and key assets represents an enormous challenge. Our Nation's critical infrastructures and
key assets are a highly complex, heterogeneous, and interdependent mix of facilities, systems, and functions that are
vulnerable to a wide variety of threats. Their sheer numbers, pervasiveness, and interconnected nature create an
almost infinite array of high-payoff targets for terrorist exploitation. Given the immense size and scope of the potential
target set, we cannot assume that we will be [*807] able to protect completely all things at all times against all
conceivable threats. n156 While not a panacea, given the reality of the terrorist threat, and depending on the critical
infrastructure or key asset at issue, n157 the use in public places of surveillance cameras equipped with facial
recognition technology to help identify suspected terrorists is a reasonable protective measure which serves a critical
government interest. n158 More generally, surveillance camera systems alone may be used in public places to deter crime and identify
criminals.
Neg: Terror DA Link—2NC
FRT will help us fight terrorism—need to balance that against civil liberties concerns
Larry Amerson, Sheriff, Calhoun County, Alabama, Testimony before the Senate Judiciary Committee, Subcommittee on Privacy, Technology and the Law, 7—18—12, www.judiciary.senate.gov/imo/media/doc/12-7-18AmersonTestimony.pdf.
On the other hand, advances in technology, and especially facial recognition, which has already been implemented in law
enforcement, national defense and the fight against terrorism, are a critical tool in protecting the rights of citizens, in
ensuring the accurate identification of suspects, prisoners and potential terrorists is almost immediately ascertained,
while protecting the safety of our citizens and law enforcement officers. There is a critical balance between protecting
the rights of law abiding citizens and providing law enforcement agencies with the most advanced tools to combat crime, properly identify
suspects, catalog those incarcerated in prisons and jails, and in defending America from acts of terrorism.
FRT will make it easier to identify suspects—aids significantly in investigations
Larry Amerson, Sheriff, Calhoun County, Alabama, Testimony before the Senate Judiciary Committee, Subcommittee on Privacy, Technology and the Law, 7—18—12, www.judiciary.senate.gov/imo/media/doc/12-7-18AmersonTestimony.pdf.
Most importantly, advances in facial recognition technology over the last 10 years will result in the end of the total reliance
on fingerprinting, where it takes hours and days to identify a suspect, fugitive or person being booked into a jail, to the immediate
identification of those known to have criminal records, or who are wanted by law enforcement. It will surprise many in the room today to know
that there is no national database of those incarcerated in America’s jails at any one time. The use of facial recognition to provide
instant identification of those incarcerated or under arrest will eliminate many problems while protecting innocent
civilians and law enforcement officers. For instance, utilizing facial recognition in law enforcement would:
• Interconnect law enforcement and intel organizations to instantly share vital information with accurate identification results.
• Establish a national database of those incarcerated, present and past, fugitives, wanted felons, and persons of interest among all law enforcement agencies.
• Allow officers to quickly determine who they are encountering and provide notification if a suspect is wanted or a convicted felon.
• Provide a simple, cost-effective, software-based solution delivered on Windows-based computers with inexpensive, non-proprietary, off-the-shelf cameras, yielding huge cost savings.
• Demonstrate new capabilities in alias detection, fugitive apprehension, and speed of suspect recognition.
• Ensure correct identification of prisoners being released and reduce the costs associated with conducting administrative procedures.
• Establish a complete national database of incarcerated persons for the first time in U.S. history; no longer could wanted criminals escape detection and arrest due to inefficient processes.
While fingerprints take hours and days for analysis, some advanced facial recognition in use today by U.S. law enforcement is as accurate as fingerprints, but results are obtained in seconds, not hours, in identifying criminals and perpetrators attempting to use false identities and aliases.
Neg: Abuse Answers
NGI is protected—access is only available to law enforcement, and safeguards are in
place
Jerome M. Pender, Deputy Assistant Director, Criminal Justice Information Services Division, Federal Bureau of Investigation, Testimony
before the Senate Judiciary Committee, Subcommittee on Privacy, Technology and the Law, 7—18—12,
www.judiciary.senate.gov/imo/media/doc/12-7-18PenderTestimony.pdf.
Searches of the national repository of mug shots are subject to all rules regarding access to FBI CJIS systems
information (28 U.S.C. § 534, the FBI security framework, and the CJIS Security Policy) and subject to dissemination rules for
authorized criminal justice agencies. Queries submitted for search against the national repository must be from
authorized criminal justice agencies for criminal justice purposes. Each participating pilot state or agency is required
to execute a Memorandum of Understanding (MOU) that details the purpose, authority, scope, disclosure and use of
information, and the security rules and procedures associated with piloting. Pilot participants are advised that all
information is treated as “law enforcement sensitive” and protected from unauthorized disclosure. Pilot participants
are informed that information derived from pilot search requests and resulting responses is to be used only as an
investigative lead. Results are not to be considered as positive identifications.
Neg: Privacy Answers
FRT does not run afoul of the Fourth Amendment—it is not a search of property
Nita Farahany, Professor, Law, Duke University, Testimony before the Senate Judiciary Committee, Subcommittee on Privacy, Technology,
and the Law, 7—18—12, www.judiciary.senate.gov/imo/media/doc/12-7-18FarahanyTestimony.pdf.
Real property law informs whether an individual has a reasonable expectation of privacy in secluding—i.e. restricting
access to others—the property searched. Facial recognition technology does not interfere with a cognizable Fourth
Amendment privacy interest, such as interference with real or intellectual property rights, nor does it intrude upon
personal security or movement. As such, there is no source of law upon which a reasonable expectation of privacy to object to facial
recognition scanning could be grounded. Concepts of possession and property are at the core of the Fourth Amendment, as its possessive
pronoun makes clear: “the right of the people to be secure in their persons, houses, papers, and effects.” And so, from the beginning, the Court
has turned to property law to inform Fourth Amendment interests. When the Court first encountered the modern investigative technique of
wiretapping, for example, which like facial recognition enables investigators to obtain evidence without any physical interference with one’s
property, the Court found that no search had occurred because conversations are not tangible property or “material things” that the Fourth
Amendment protects. Likewise, facial recognition technology does not implicate any property interests.
FRT is permissible under the Fourth Amendment—we have no reasonable expectation of
privacy when we are in public / not secluded
Nita Farahany, Professor, Law, Duke University, Testimony before the Senate Judiciary Committee, Subcommittee on Privacy, Technology,
and the Law, 7—18—12, www.judiciary.senate.gov/imo/media/doc/12-7-18FarahanyTestimony.pdf.
Even with this expanded view of individual interests, however, an individual who is scanned in public cannot
reasonably claim that facial recognition technology captures something he has sought to seclude from public view.
Instead, he must argue that he has a reasonable expectation of privacy in his personal identity associated with his
facial features. Under current doctrine, courts would properly reject such a claim. Despite the shift in Katz from purely property-based privacy protections to seclusion more generally, the Court has not recognized an independent privacy interest in the secrecy of identifying information per se. Consequently, it is the physical intrusiveness of facial recognition technology, and not the extent to which it reveals personally identifying information, that will determine its reasonableness in a Fourth Amendment inquiry. n38 And because the technology is
physically unobtrusive and does not reveal information that is secluded or otherwise hidden from public view, it is not
properly characterized as a Fourth Amendment search.
Neg: Private Sources Biggest Threat
Private industry practices pose substantial risks to privacy
Anne T. McKenna, attorney, “Pass Parallel Privacy Standards or Privacy Perishes,” RUTGERS LAW REVIEW v. 65, Summer 2013, p.
1075-1076.
Private industry's unfettered collection, use, and sale of citizens' personal data does pose risks. n255 One of these risks
is that information collected will be used for purposes other than those expected by or disclosed to the consumer at
the time of collection. n256 This privacy wrong has broad implications, because consumers currently have no choice about the gathering of
their own information or its use. n257 Many people unknowingly supply data when a software app gathers it surreptitiously, or
they supply information through the Internet or their smartphone for convenience. n258 In the latter case, they do so because
their choice is either give information or be precluded from the use of a helpful or popular application. n259 Convenience often outweighs
thoughts of privacy, yet when the information given is used for a purpose different from that of the application for which it was provided, the
consumer has been wronged. n260 Some newer technologies remove consumer choice entirely from the equation. n261 Face recognition
technology, for example, is used in public and automatically captures one's image without consent, and in most cases, without knowledge that
the data capture (and possible subsequent identification) has even happened. n262 A consumer cannot expect to have any
semblance of control over their information and their identification if they do not realize their image has been taken or
if the image was taken without consent.
Consumers are unaware of the quantity of information provided to and gathered by third parties through smartphones
and Internet activity. n263 For example, by combining a consumer's likes and other [*1076] personal information with
face recognition technology, a previously anonymous person in public could be identified and then targeted with
specific marketing based upon the combination of that identification with the already-collected personal information.
The biggest privacy risks raised by FRT come from private sources
Bridget Mallon, “’Every Breath You Take, Every Move You Make, I’ll Be Watching You’: The Use of Facial Recognition Technology,”
VILLANOVA LAW REVIEW v. 48, 2003, p. 979-981.
The true privacy problems arise from third-party use of face recognition technology. n138 The biggest concern
stemming from third party use is [*980] the potential for private citizens to develop and maintain vast amounts of
information on individuals. n139 With the new technology being made available to the public, there is the possibility
that businesses across the country will install surveillance cameras to scan the faces of their customers and
employees. n140 There is also a fear that businesses will begin to develop data files on their customers and employees,
and then share these files with other businesses. n141 The result is that businesses could track customer purchases or
the whereabouts of their employees and every time individuals enter a store, their faces can call up their entire data
file. n142 Personal information is meant to remain private. Thus, the fact that technology is giving private individuals the power to recall
personal information with a simple photograph raises concerns over the need to regulate this new technology. n143 While the Constitution may
place limits on the government's use of this technology, there is no equivalent that regulates its use by private citizens. n144 Currently, there are
no state or federal [*981] laws that regulate the use of face recognition technology and laws that do control the use of an individual's private
information are ill-equipped to handle this new and changing technology. n145 Without restriction, there is the potential for private use of face
recognition technology to cross the boundary from providing security to invading privacy. n146 As it stands now, the use of face recognition
technology does not violate the protection afforded by the Constitution. Nevertheless, there still remains an unlimited amount of danger that this
technology can pose. n147