overload core file - University of Michigan Debate Camp Wiki

OVERLOAD CORE – UMich 7wk
MAGS
Gabi Yamout
Ryan Powell
Isaiah Sirois
Ammar Plumber
Derek Tang
Aff
A2: Terror DA
Yes Overload
Experts agree – squo surveillance is counterproductive and wastes money
Ward 15 – staff writer (Stan, “NSA swamped with data overload also trashes the Constitution,”
Best VPN, 5/18/2015, https://www.bestvpn.com/blog/19187/nsa-swamped-with-data-overload-also-trashes-the-constitution/) //RGP
Almost on the second anniversary of the Edward Snowden revelations, another (in)famous
NSA
whistleblower has again spoken up. This comes at a pivotal juncture in the legislative calendar as
contentious debate about surveillance rages over the impending sunset of some of the Patriot Act. It has long been
an argument of the civil liberties crowd that bulk data gathering was counter-productive, if
not counter-intuitive. The argument was couched in language suggesting that to “collect it all”, as the then NSA
director James Clapper famously decried, was to, in effect, gather nothing, as the
choking amounts of information collected would be so great as to be unable
to be analyzed effectively. This assertion is supported by William Binney, a founder of Contrast Security and a
former NSA official, logging more than three decades at the agency. In alluding to what he termed “bulk
data failure”, Binney said that an analyst today can run one simple query across
the NSA’s various databases, only to become immediately overloaded with
information. With about four billion people (around two-thirds of the world’s population) under the NSA and
partner agencies’ watchful eyes, according to his estimates, there is far too much data being collected.
“That’s why they couldn’t stop the Boston bombing, or the Paris shootings,
because the data was all there… The data was all there… the NSA is great at going back over it forensically
for years to see what they were doing before that. But that doesn’t stop it.” Binney is in a position to
know, earning his stripes during the terrorism build up that culminated with the
9/11 World Trade Center bombing in 2001. He left just days after the draconian legislation known as
the USA Patriot Act was enacted by Congress on the heels of that attack. One of the reasons which prompted his leaving
was the scrapping of a surveillance system on which he long worked, only to be replaced by more intrusive systems. It is
interesting to note here that Edward Snowden, in alluding to Binney, said he was inspired by Binney’s plight, and that this,
in part, prodded him to leak thousands of classified documents to journalists. Little did Binney know that his work was to
be but the tip of the iceberg in a program that eventually grew to indiscriminately “collect it all.” What is worrisome is the
complicity with the bulk data collection by dozens of private companies – maybe as many as 72. Yet this
type of
collection pales in comparison to that of the “Upstream” program in which the
NSA tapped into undersea fiber optic cables. With the cooperation of Britain’s GCHQ, the NSA is
able to sift more than 21 petabytes a day. Gathering such enormous amounts of information is
expensive and ineffective, according to Binney. But it gets lawmakers’ attention in
a way that results in massive increases in NSA budgets. Binney warns that, “They’re taking
away half of the Constitution in secret.” President Obama has presided over this agency’s land grab, and has endorsed it,
often referring to Upstream as a “critical national security tool.” His feckless approach to the spying build-up is the
reason for its proliferation, and is why Congress meanders rudderless in attempts to curtail it. The President’s anti-privacy
stance is being “rewarded” by repudiation among members of his own party, and is reflected in their rejecting his latest
legacy-building, pet piece of legislation – the Trans Pacific Partnership (TPP). But their constituents would be better
served by producing legislation that would restore Constitutional rights trampled on by the NSA.
Bulk data collection fails – it saps critical resources and diverts attention
Maass 15 – Journalist for The Intercept (Peter, “INSIDE NSA, OFFICIALS PRIVATELY
CRITICIZE “COLLECT IT ALL” SURVEILLANCE,” The Intercept, 5/28/2015,
https://firstlook.org/theintercept/2015/05/28/nsa-officials-privately-criticize-collect-it-all-surveillance/) //RGP
AS MEMBERS OF CONGRESS struggle to agree on which surveillance programs to re-authorize before the Patriot Act
expires, they might consider the unusual advice of an intelligence analyst at the National Security Agency who warned
about the danger of collecting too much data. Imagine, the analyst wrote in a leaked document, that you are standing in a
shopping aisle trying to decide between jam, jelly or fruit spread, which size, sugar-free or not, generic or Smucker’s. It can
be paralyzing. “We
in the agency are at risk of a similar, collective paralysis in the face
of a dizzying array of choices every single day,” the analyst wrote in 2011. “’Analysis
paralysis’ isn’t only a cute rhyme. It’s the term for what happens when you spend so much
time analyzing a situation that you ultimately stymie any outcome …. It’s what happens in
SIGINT [signals intelligence] when we have access to endless possibilities, but we struggle to prioritize, narrow, and
exploit the best ones.” The document is one of about a dozen in which NSA intelligence experts express concerns usually heard from the agency’s critics: that the U.S. government’s “collect it all” strategy can undermine the effort to fight terrorism. The documents, provided to The Intercept by NSA whistleblower Edward Snowden, appear to contradict years of statements from senior officials who have claimed
that pervasive surveillance of global communications helps the government identify terrorists before they strike or quickly
find them after an attack. The Patriot Act, portions of which expire on Sunday, has been used since 2001 to
conduct a number of dragnet surveillance programs, including the bulk collection of phone metadata from American
companies. But the documents suggest that analysts at the NSA have drowned in data since 9/11, making it more difficult for them to find the real threats. The titles of the
documents capture their overall message: “Data Is Not Intelligence,” “The Fallacies Behind the Scenes,” “Cognitive
Overflow?” “Summit Fever” and “In Praise of Not Knowing.” Other titles include “Dealing With a ‘Tsunami’ of Intercept”
and “Overcome by Overload?” The documents are not uniform in their positions. Some acknowledge the overload problem
but say the agency is adjusting well. They do not specifically mention the Patriot Act, just the larger dilemma of cutting
through a flood of incoming data. But in
an apparent sign of the scale of the problem, the
documents confirm that the NSA even has a special category of programs that is
called “Coping With Information Overload.” The jam vs. jelly document, titled “Too Many Choices,”
started off in a colorful way but ended with a fairly stark warning: “The SIGINT mission is far too vital to unnecessarily
expand the haystacks while we search for the needles. Prioritization is key.” These
doubts are infrequently
heard from officials inside the NSA. These documents are a window into the private
thinking of mid-level officials who are almost never permitted to discuss their concerns in public. AN AMUSING
PARABLE circulated at the NSA a few years ago. Two people go to a farm and purchase a truckload of melons for a dollar
each. They then sell the melons along a busy road for the same price, a dollar. As they drive back to the farm for another
load, they realize they aren’t making a profit, so one of them suggests, “Do you think we need a bigger truck?” The parable
was written by an intelligence analyst in a document dated Jan. 23, 2012 that was titled, “Do We Need a Bigger SIGINT
Truck?” It expresses, in a lively fashion, a critique of the agency’s effort to collect what former NSA Director Keith
Alexander referred to as “the whole haystack.” The critique goes to the heart of the agency’s drive to gather as much of the
world’s communications as possible: because it may not find what it needs in a partial haystack of data, the haystack is
expanded as much as possible, on the assumption that more data will eventually yield useful information. “THE
PROBLEM IS THAT WHEN YOU COLLECT IT ALL, WHEN YOU MONITOR
EVERYONE, YOU UNDERSTAND NOTHING.” –EDWARD SNOWDEN The Snowden files show
that in practice, it doesn’t turn out that way: more is not necessarily better, and in fact, extreme
volume creates its own challenges. “Recently I tried to answer what seemed like a relatively
straightforward question about which telephony metadata collection capabilities are the most important in case we need
to shut something off when the metadata coffers get full,” wrote the intelligence analyst. “By the end of the day, I felt like
capitulating with the white flag of, ‘We need COLOSSAL data storage so we don’t have to worry about it,’ (aka we need a
bigger SIGINT truck).” The analyst added, “Without
metrics, how do we know that we have
improved something or made it worse? There’s a running joke … that we’ll only know if collection is
important by shutting it off and seeing if someone screams.” Another document, while not mentioning the dangers of
collecting too much data, expressed concerns about pursuing entrenched but unproductive programs. “How many times
have you been watching a terrible movie, only to convince yourself to stick it out to the end and find out what happens,
since you’ve already invested too much time or money to simply walk away?” the document asked. “This ‘gone too far to stop now’ mentality is our built-in mechanism to help us allocate and ration resources. However, it can work to our detriment in prioritizing and deciding which projects or efforts are worth
further expenditure of resources, regardless of how much has already been ‘sunk.’ As has been said before, insanity is
doing the same thing over and over and expecting different results.” “WE
ARE DROWNING IN
INFORMATION. AND YET WE KNOW NOTHING. FOR SURE.” –NSA INTELLIGENCE
ANALYST Many of these documents were written by intelligence analysts who had regular columns distributed on
NSANet, the agency’s intranet. One of the columns was called “Signal v. Noise,” another was called “The SIGINT
Philosopher.” Two of the documents cite the academic work of Herbert Simon, who won a Nobel Prize for his pioneering
research on what’s become known as the attention economy. Simon wrote that consumers and managers have trouble
making smart choices because their exposure to more information decreases their ability to understand the information.
Both documents mention the same passage from Simon’s essay, Designing Organizations for an Information-Rich World:
“In
an information-rich world, the wealth of information means a dearth of
something else: a scarcity of whatever it is that information consumes. What
information consumes is rather obvious: it consumes the attention of its
recipients. Hence a wealth of information creates a poverty of attention and a need
to allocate that attention efficiently among the overabundance of information
sources that might consume it.” In addition to consulting Nobel-prize winning work, NSA analysts have
turned to easier literature, such as Malcolm Gladwell’s best-selling Blink: The Power of Thinking Without Thinking. The
author of a 2011 document referenced Blink and stated, “The
key to good decision making is not
knowledge. It is understanding. We are swimming in the former. We are
desperately lacking in the latter.” The author added, “Gladwell has captured one of the biggest challenges
facing SID today. Our costs associated with this information overload are not only financial, such as the need to build data
warehouses large enough to store the mountain of data that arrives at our doorstep each day, but also include the more
intangible costs of too much data to review, process, translate and report.” Alexander, the NSA director from 2005 to 2014
and chief proponent of the agency’s “collect it all” strategy, vigorously defended the bulk collection programs. “What we
have, from my perspective, is a reasonable approach on how we can defend our nation and protect our civil liberties and
privacy,” he said at a security conference in Aspen in 2013. He added, “You need the haystack to find the needle.” The
same point has been made by other officials, including James Cole, the former deputy attorney general who told a
congressional committee in 2013, “If you’re looking for the needle in the haystack, you have to have the entire haystack to
look through.” [NSA slide, May 2011] The opposing viewpoint was voiced earlier this month by Snowden, who noted in an
interview with the Guardian that the
men who committed recent terrorist attacks in France,
Canada and Australia were under surveillance—their data was in the haystack yet
they weren’t singled out. “It wasn’t the fact that we weren’t watching people or
not,” Snowden said. “It was the fact that we were watching people so much that
we did not understand what we had. The problem is that when you collect it all, when
you monitor everyone, you understand nothing.” In a 2011 interview with SIDtoday, a deputy
director in the Signals Intelligence Directorate was asked about “analytic modernization” at the agency. His response,
while positive on the NSA’s ability to surmount obstacles, noted that it faced difficulties, including the fact that some
targets use encryption and switch phone numbers to avoid detection. He pointed to volume as a particular problem. “We
live in an Information Age when we have massive reserves of information and
don’t have the capability to exploit it,” he stated. “I was told that there are 2 petabytes of data in the
SIGINT System at any given time. How much is that? That’s equal to 20 million 4-drawer filing cabinets. How many
cabinets per analyst is that? By the end of this year, we’ll have 1 terabyte of data per second coming in. You can’t crank that
through the existing processes and be effective.” The documents noted the difficulty of sifting through the ever-growing
haystack of data. For instance, a 2011 document titled “ELINT Analysts – Overcome by Overload? Help is Here with
IM&S” outlined a half dozen computer tools that “are designed to invert the paradigm where an analyst spends more time
searching for data than analyzing it.” Another document, written by an intelligence analyst in 2010, bluntly stated that “we
are drowning in information. And yet we know nothing. For sure.” The analyst went on to ask, “Anyone know just how
many tools are available at the Agency, alone? Would you know where to go to find out? Anyone ever start a new
target…without the first clue where to begin? Did you ever start a project wondering if you were the sole person in the
Intelligence Community to work this project? How would you find out?” The analyst, trying to encourage more sharing of
tips about the best ways to find data in the haystack, concluded by writing, in boldface, “Don’t let those coming behind you
suffer the way you have.” The
agency appears to be spending significant sums of money to
solve the haystack problem. The document headlined “Dealing With a ‘Tsunami’ of Intercept,” written in
2006 by three NSA officials and previously published by The Intercept, outlined a series of programs to prepare for a near
future in which the speed and volume of signals intelligence would explode “almost beyond imagination.” The document
referred to a mysterious NSA entity–the “Coping With Information Overload Office.” This appears to be related to an item
in the Intelligence Community’s 2013 Budget Justification to Congress, known as the “black budget”—$48.6 million for projects related to “Coping with Information Overload.”
Mass surveillance is counter-productive for fighting terrorism – it causes information overload
Gross 13 – covers technology and telecom policy in the U.S. government for the IDG News
Service, and is based in Washington, D.C. (Grant, “Critics question whether NSA data collection is
effective,” PC World, 6/25/2013, http://www.pcworld.com/article/2042976/critics-question-whether-nsa-data-collection-is-effective.html) //RGP
The recently
revealed mass collection of phone records and other communications
by the U.S. National Security Agency may not be effective in preventing terrorism,
according to some critics. The data collection programs, as revealed by former NSA contractor Edward Snowden, are giving government agencies information overload, critics said during the
Computers, Freedom and Privacy Conference in Washington, D.C. “In knowing a lot about a lot of
different people [the data collection] is great for that,” said Mike German, a former Federal
Bureau of Investigation special agent who is now policy counsel for national security at the American Civil Liberties Union. “In
actually finding the very few bad actors that are out there, not so good.” The mass
collection of data from innocent people “won’t tell you how guilty people act,” German added. The problem with
catching terrorism suspects has never been the inability to collect information,
but to analyze the “oceans” of information collected, he said. Mass data collection is
“like trying to look for needles by building bigger haystacks,” added Wendy
Grossman, a freelance technology writer who helped organize the conference. But Timothy Edgar, a former civil liberties
watchdog in the Obama White House and at the Office of Director of National Intelligence, partly defended the NSA
collection programs, noting that U.S. intelligence officials attribute the surveillance programs with preventing more than
50 terrorist actions. Some critics have disputed those assertions. Edgar criticized President Barack Obama’s
administration for keeping the NSA programs secret. He also said it was “ridiculous” for Obama to suggest that U.S.
residents shouldn’t be concerned about privacy because the NSA is collecting phone metadata and not the content of
phone calls. Information about who people call and when they call is sensitive, he said. But Edgar, now a visiting fellow at
the Watson Institute for International Studies at Brown University, also said that Congress, the Foreign Intelligence
Surveillance Court and internal auditors provide some oversight of the data collection programs, with more checks on data
collection in place in the U.S. than in many other countries. Analysts can query the phone records database only if they see
a connection to terrorism, he said. The U.S. has some safeguards that are “meaningful and substantive, although I’m sure
many in this room ... and maybe even me, if I think about it long enough, might think they’re not good enough,” Edgar
said. While German noted that the NSA has reported multiple instances of unauthorized access by employees to the
antiterrorism databases, Edgar defended the self-reporting. “It’s an indication of a compliance system that’s actually
meaningful and working,” he said. “If you had a compliance system that said there was no violation, there were never any
mistakes, there was never any improper targeting that took place ... that would be an indication of a compliance regime that
was completely meaningless.” The
mass data collection combined with better data analysis
tools translates into an “arms race” where intelligence officials try to find new
connections with the data they collect, said Ashkan Soltani, a technology and privacy consultant. New
data analysis tools lead intelligence officials to believe they can find more links to
terrorism if they just have “enough data,” but that belief is “too much sci fi,” he
said. “This is the difficult part, if you’re saying that if we have enough data we’ll be
able to predict the future,” the ACLU’s German said.
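German’s point that mass collection is “not so good” at “finding the very few bad actors” is, at bottom, a base-rate problem. A minimal sketch with hypothetical numbers (none of these figures come from the card) shows why even a highly accurate detector drowns in false alarms when real threats are rare:

```python
# Base-rate sketch with assumed, illustrative numbers: when targets are
# extremely rare, almost everyone a detector flags is innocent.
population = 300_000_000      # people whose data is collected (assumption)
true_threats = 300            # actual bad actors in that pool (assumption)
sensitivity = 0.99            # P(flagged | threat) - generous accuracy
false_positive_rate = 0.001   # P(flagged | innocent), i.e. 99.9% specificity

true_flags = true_threats * sensitivity
false_flags = (population - true_threats) * false_positive_rate

# Bayes' rule: P(threat | flagged) = true flags / all flags
precision = true_flags / (true_flags + false_flags)
print(f"innocent people flagged: {false_flags:,.0f}")
print(f"P(actual threat | flagged): {precision:.5f}")  # roughly 1 in 1,000
```

Under these assumptions the system flags about 300,000 innocents to catch a few hundred threats, so a flag is almost never a real lead – the arithmetic behind the “bigger haystacks” objection.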
NSA is overloaded – disproportion between analysts and data risks surprises
SIDtoday, 11
The Signals Intelligence Directorate Today Editor, “Is There a Sustainable Ops Tempo in S2? How
Can Analysts Deal With the Flood of Collection? – An interview with [redacted] (conclusion),”
4/16/11, https://s3.amazonaws.com/s3.documentcloud.org/documents/2089125/analytic-modernization.pdf // IS
Q: 7. (U//FOUO) Various pushes for analytic modernization
have been going on for
decades at NSA, but now the issue really seems to be taking center stage. In fact, the number
one "SIGINT Goal for 2011-2015" is to "revolutionize analysis." What's different now?
A: (S//SI//REL) We live in an Information Age when we have massive reserves of
information and don't have the capability to exploit it. I was told that there are 2
petabytes of data in the SIGINT System at any given time. How much is that? That's equal to 20
million 4-drawer filing cabinets. How many cabinets per analyst is that?? By the end of this
year, we'll have 1 terabyte of data per second coming in. You can't crank that
through the existing processes and be effective.
Q: (U) ...So it's a matter of volume?
A: (S//SI//REL) Not volume alone, but also complexity. We need to piece together
the data. It's impossible to do that using traditional methods. Strong selectors like phone numbers - will become a thing of the past. It used to be that if you had
a target's number, you could follow it for most of your career. Not anymore. My
daughter doesn't even make phone calls, and many targets do the same. Also, the commercial
market demands privacy, and this will drive our targets to go encrypted, maybe into
unexploitable realms. Our nation needs us to look for patterns surrounding a particular spot
on Earth and make the connections - who can do that if not us? And we can't do it using
traditional methods.
Q: (U) Looking into the future, is there anything that especially worries you? ...An eventuality
(internal or external) that would make it hard for A&P to continue to put out quality intelligence?
A: (U//FOUO) I'm worried that we have so
much good stuff that we could lock
down analysts and have them just producing product, and something would
jump out and surprise us. So we need the discipline to invest in the wild and the
unknowns.
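The interview’s volume figures can be sanity-checked with back-of-the-envelope arithmetic. The bytes-per-cabinet figure below is an assumption of this sketch, not something stated in the interview:

```python
# Back-of-the-envelope check of the interview's data-volume figures.
# Assumption (not from the interview): a 4-drawer filing cabinet holds
# about 10,000 pages at ~10 KB of text each, i.e. ~100 MB per cabinet.
PB = 10**15
TB = 10**12

sigint_store = 2 * PB                     # "2 petabytes ... at any given time"
bytes_per_cabinet = 10_000 * 10 * 1_000   # ~100 MB
cabinets = sigint_store / bytes_per_cabinet
print(f"filing-cabinet equivalents: {cabinets:,.0f}")  # 20,000,000

ingest_rate = 1 * TB                      # "1 terabyte of data per second"
per_day_pb = ingest_rate * 86_400 / PB    # seconds per day
print(f"daily intake at 1 TB/s: {per_day_pb:,.1f} PB")  # 86.4 PB
```

Under that assumption the “20 million 4-drawer filing cabinets” claim checks out exactly, and a 1 TB/s intake implies roughly 86 petabytes arriving per day – far more than the standing 2 PB store, which underlines the interviewee’s point about existing processes being overwhelmed.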
Analyst improvement is key to check overload
SID Reporting Board, 7
Signals Intelligence Directorate Reporting Board, part of the largest functional directorate in the
NSA, “Data Is Not Intelligence,” 09/18/07,
https://s3.amazonaws.com/s3.documentcloud.org/documents/2088973/data-is-not-intelligence.pdf // IS
(U) ‘Data Is Not Intelligence’
FROM: [Redacted]
SID Reporting Board (S12R) Run Date: 09/18/2007
(U//FOUO) These words came from Dr. Thomas Fingar (pictured) in his keynote
address at the Analytic Transformation Symposium in Chicago on 5 September. Such
a strong reminder at the opening of his address was intended to remind those at the symposium
of the importance he and the Director of National Intelligence, the Honorable J. Michael
McConnell, place on improving analysis throughout the Intelligence Community.
(U//FOUO) Dr. Fingar, the Deputy Director of National Intelligence for Analysis, made this
statement at the opening of the symposium sponsored by the Intelligence and National Security
Alliance, a non-profit, non-partisan public policy forum focusing on intelligence and national
security issues. The symposium was held in Chicago, Illinois, from 4 to 6 September 2007.
(U//FOUO) Dr. Fingar
continued by saying that "intelligence comes from the
brains of analysts."
He clearly wanted those attending the symposium to understand his view of the importance of the
analytic process in producing intelligence. The emphasis throughout his remarks was
that the Intelligence Community must transform its analytic mission. The
transformation is being effected in three areas: enhancing the quality of analytic
products; managing the mission more effectively at a Community level; and building
more integrated analytic operations across the Intelligence Community.
(U//FOUO) To enhance the quality of analytic products, analysts themselves must
improve. They can do this by receiving more and better formal training, and by
continuing to learn through experience and mentoring from more experienced analysts.
In addition, they must alter mindsets that keep them from sharing information, especially that
which would improve an intelligence product. An adjunct to changing mindsets about sharing
information is establishing trust between and among analysts as a way to improve the quality of
analytic products.
(U//FOUO) In an explanation of how to manage the analytic mission more effectively at the
Community level, Dr. Fingar reviewed the A-Space and Library of National Intelligence (LNI)
programs. While some leaders might consider these two programs more as tools, Dr. Fingar
stressed that they were programs to help analysts enhance products. A-Space will provide a
virtual environment in which analysts can work on data and collaborate. The LNI will give
analysts a research facility that will help them gather already-disseminated intelligence on a topic.
(U//FOUO) The effort to build more integrated
analytic operations involves, in
part, greatly improving collaboration. Setting common standards is a key to
collaboration, and collaboration will enhance the quality of analytic products, according to Dr.
Fingar. He emphasized that the IC analytic standards recently approved were a step, but only a
step. He called for "transparency" in intelligence analysis; that is, that all analysis has to be
reproducible. Following established common standards will help ensure transparency. More
importantly, collaboration will help establish an analytic community.
(U//FOUO) Dr. Fingar's address set the tone for the rest of the symposium. The point was that
the quality of intelligence products must improve--must "transform." The most
important part in the transformation is the analyst. In training analysts better, by
encouraging them to learn continually through experience and mentoring, product
will improve. More effective management, through programs such as A-Space and the
LNI, will help give analysts data and intelligence they need, and a better environment
in which to work. Collaboration is encouraged and made easier by these programs,
and collaboration is part of building integrated operations. All of these together will help
ensure that the quality of analytic products improves – that customers receive intelligence, not data.
Big data risks errors with immediate and magnified impacts – the timeframe is nanoseconds
Zoldan, 13
Ari Zoldan is an entrepreneur in the technology industry and business analyst based primarily in
New York City and Washington, D.C. “More Data, More Problems: is Big Data Always Right?”
Wired, May 2013, http://www.wired.com/2013/05/more-data-more-problems-is-big-data-always-right/ // IS
Which leads us to our second problem: the sheer amount of data! No wonder we are
more prone to “signal error” and “confirmation bias.” Signal error is when large
gaps of data have been overlooked by analysts. If places like Coney Island and Rockaway
were overlooked in Hurricane Sandy, like they were in the Twitter study, we could be looking at a
higher death toll today. Confirmation bias is the phenomenon that people will search
within the data to confirm their own preexisting viewpoint, and disregard the
data that goes against their previously held position. In other words, you will find what
you seek out. What if FEMA looked at the Twitter data with a preexisting belief that the worst hit
part of the Tri-state area was Manhattan? They may have allocated their resources in places that
didn’t need it the most. The third problem is best described by Marcia Richards Suelzer, senior
analyst at Wolters Kluwer. She says, “We can now make catastrophic miscalculations in
nanoseconds and broadcast them universally. We have lost the balance in ‘lag
time.'” Simply put, when we botch the facts, our ability to create damage is
greatly magnified because of our enhanced technology, global
interconnectivity, and huge data sizes.
Upstream controls the bulk of data – whistleblowers confirm
Whittaker, 15
Zack Whittaker is a writer-editor for ZDNet, and sister sites CNET and CBS News, citing an NSA
whistleblower, “NSA is so overwhelmed with data, it's no longer effective, says whistleblower,”
ZDNet, 4/30/15, http://www.zdnet.com/article/nsa-whistleblower-overwhelmed-with-data-ineffective/ // IS
"The Upstream program is where the vast bulk of the information was being collected," said Binney, talking about how the NSA tapped undersea fiber optic cables. With help from its British counterparts at GCHQ, the NSA is able to "buffer" more than 21 petabytes a day. Binney said the "collect it all" mantra now may be the norm, but it's expensive and ineffective.
NSA’s spot-a-terrorist data-mining algorithms fail
Musgrave 13 (Shawn, Projects Editor at the public records intelligence site MuckRock.com,
6/8, “Does Mining Our Big Data for Terrorists Actually Make Us Any Safer?”
http://motherboard.vice.com/blog/does-mining-our-big-data-for-terrorists-actually-make-us-any-safer // Tang)
Whether it's at the NSA, FBI, CIA or some more classified body we mere citizens aren't mature enough to know about,
data-mining is the belle of the intelligence ball. The power of statistical prediction to connect the dots, preemptively
identify the bad guys and thwart the next terrorist attack has been trumpeted loudly in defense of surveillance programs,
including the NSA's latest misadventure. But many
counterterrorism and statistical experts
doubt that even the most advanced spot-a-terrorist algorithms can produce
anything beyond false positives and mangled civil liberties. In his address Friday afternoon,
President Obama downplayed the recent revelations about NSA surveillance, dismissing much of the ensuing scrutiny as
“hype.” He
said that the NSA's extensive collection of phone call metadata from
Verizon, Sprint and AT&T, as well as its PRISM program to vacuum up server data from
Google, Facebook, Microsoft and other Internet service providers (Dropbox coming soon!) were both legal and
appropriately supervised. These programs “help us prevent terrorist attacks,” he said, and “on net it was
worth us doing.” Senator Dianne Feinstein, standing next to Saxby Chambliss, her Republican counterpart on the Senate
Intelligence Committee, explained to the citizenry, “It's called protecting America.” As construction workers put the
finishing touches on the NSA's new data facility in Utah—it is said that it will be the largest data center in the world—
details continue to emerge that flesh out the exact shape and scope of NSA's various dragnets. As groups like the
Electronic Frontier Foundation have been warning for years, it's clear that the agency is pouring considerable resources
into collecting and parsing through vast datasets in hopes of neutralizing terrorist threats. But, as has been asked of the
TSA and DHS more widely, where's
the actual proof these programs offer more benefits
than downsides? Where are the thwarted plots to balance against the chill of privacy loss and the threats to, say,
activists and the government's political opponents? Among national security experts and data
scientists, there's considerable skepticism that NSA-style data-mining is an
appropriate tool for ferreting out security threats. As Ben Smith reported yesterday, finding the
Boston bombers relied on old fashioned police work, not troves of data. In a 2008
study, the National Research Council concluded that combing data streams for
terrorists is “neither feasible as an objective nor desirable as a goal.” In particular, the
report's authors underscore dubious data quality and high risk of false positives
as practical obstacles to mining data for signatures of terrorist behavior. “There's been
considerable interest in the intelligence community around using data to identify terrorists,” says Stephen Fienberg, a
professor of statistics and social sciences at Carnegie Mellon University, who contributed to the NRC report. “But the
specifics have always been elusive, and the claims are rarely backed up by serious empirical study.” Fienberg insists that the rarity
of
terrorist events (and terrorists themselves) makes predicting their occurrence a
fraught crapshoot. He says that intelligence analysts lack training data – indicative
patterns of behavior drawn from observing multiple iterations of a complex event
– to verify whether their models have predictive validity. “These are very, very rare events –
terrorist events and terrorists themselves – that you're trying to predict. Clearly there are places where this kind of
predictive activity has been very successful – fraud detection in telecommunications, for example – but there we're talking
not-so-rare events.” Jeff Jonas, a data scientist at IBM and senior associate at the Center for Strategic and International
Studies, agrees, dismissing such terrorism prediction models as “civil liberty infringement engines.” In a 2006 paper co-written with Jim Harper of the Cato Institute, Jonas
asserts that sheer probability and a lack of
historical data dooms counterterrorism data-mining to a quagmire of false
positives. “Unless investigators can winnow their investigations down to data sets already known to reflect a high
incidence of actual terrorist information,” Jonas and Harper write, “the high number of false positives
will render any results essentially useless.” Ethical (not to mention constitutional) issues of wrongly
painting people as terrorists aside, Jonas and Harper suggest that chasing down so many bogus leads
only detracts from pursuing genuine ones, and thus actually hampers effective
counterterrorism. In a 2006 interview with the New York Times, an FBI official confirmed the
considerable waste and frustration of running down bogus tip-offs from the
NSA's wiretap dragnet, joking that the endless stream of leads meant more "calls
to Pizza Hut” or contacting a “school teacher with no indication they've ever been involved in international
terrorism - case closed." Given enough data and fine-tuning of algorithms, of course, other experts emphasize that false
positives can be reduced significantly, and insist that data-mining will play a key role in counterterrorism. Kim Taipale of
the Center for Advanced Studies in Science and Technology Policy testified to this effect before the Senate Judiciary
Committee in 2007, criticizing Jonas and Harper specifically for making “pseudo-technical” arguments that fail to reflect
the way actual data-mining algorithms work. And even critics admit that,
with enough data to develop these training sets, analysts might be able to sift out terrorist markers. “If you can get your
arms around a big enough set of data, you'll always find something in there,” says Fred Cate, director of the Center for
Applied Cybersecurity Research at Indiana University Law School, another contributor to the NRC report. “It's not
unreasonable to think that the more data you can get access to that you might discover something of predictive value.” The
ease of mining personal data may make these systems ripe for abuse, but that ease also lends itself to a “better safe than
sorry” mindset. “There's a certain 'because it's there' nature to this,” says Cate. “If you know all these records are there,
you worry about explaining why you didn't try to get access to them” to stop a terror plot. As
more and more
revealing information finds its way online and into commercial databases, the
temptation increases for intelligence agencies to gobble up this data just in case.
But the wider the net we cast—and the broader incursion on the privacy of Americans and others—the heavier the burden
becomes to produce a terrorist or two. And to Cate's knowledge, despite extensive mining, the NSA has struck no such
motherlode. While the government has acknowledged that these latest data surveillance programs are several years old,
they have yet to trot out any concrete evidence of their efficacy. Between the NSA's dismal record, drowsy oversight from
the top-secret FISA courts and vague promises from Obama, Feinstein and others that this will all be worth it someday,
Washington should buckle up for plenty more “hype” from the civil libertarian set. Absent public exposure, independent
oversight, and robust evaluation, it's impossible to determine whether such efforts truly have anything to throw on the
scale against citizen privacy.
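The false-positive objection Jonas and Harper raise is, at bottom, a base-rate calculation. A minimal sketch in Python, using purely illustrative numbers (none of them come from the paper or the article), shows how even a screening system that catches 99% of real threats and wrongly flags only 0.1% of innocents still buries a handful of genuine leads under hundreds of thousands of false alarms:

```python
# Base-rate sketch: why rare events swamp even accurate screening.
# All numbers below are illustrative assumptions, not figures from
# Jonas & Harper's paper.

population = 300_000_000      # people screened
true_positive_rate = 0.99     # chance a real terrorist is flagged
false_positive_rate = 0.001   # chance an innocent is flagged (0.1%)
actual_terrorists = 300       # assumed number of real threats

flagged_terrorists = actual_terrorists * true_positive_rate
flagged_innocents = (population - actual_terrorists) * false_positive_rate

# Precision: what fraction of all flags point at a real threat
precision = flagged_terrorists / (flagged_terrorists + flagged_innocents)

print(f"Innocents flagged: {flagged_innocents:,.0f}")
print(f"Real threats flagged: {flagged_terrorists:,.0f}")
print(f"Share of flags that are real threats: {precision:.4%}")
```

On these assumptions, fewer than one flag in a thousand points at a real threat, which is exactly the "quagmire of false positives" the card describes.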
NSA mass surveillance results in overload
Angwin 13 (Julia, writer for WSJ, 12/25, “NSA Struggles to Make Sense of Flood of
Surveillance Data,”
http://www.wsj.com/articles/SB10001424052702304202204579252022823658850 //Tang)
William Binney, creator of some of the computer code used by the National Security Agency to snoop on Internet
traffic around the world, delivered
an unusual message here in September to an audience
worried that the spy agency knows too much. It knows so much, he said, that it
can't understand what it has. "What they are doing is making themselves
dysfunctional by taking all this data," Mr. Binney said at a privacy conference here. The agency is
drowning in useless data, which harms its ability to conduct legitimate
surveillance, claims Mr. Binney, who rose to the civilian equivalent of a general during more than 30 years at the
NSA before retiring in 2001. Analysts are swamped with so much information that they
can't do their jobs effectively, and the enormous stockpile is an irresistible
temptation for misuse. Mr. Binney's warning has gotten far less attention than legal questions raised by leaks
from former NSA contractor Edward Snowden about the agency's mass collection of information around the world. Those
revelations unleashed a re-examination of the spy agency's aggressive tactics. But the NSA needs more room to store all the data it collects—and new phone records, data on money
transfers and other information keep pouring in. A new storage center being built in Utah will eventually be able to hold
more than 100,000 times as much as the contents of printed materials in the Library of Congress, according to outside
experts. Some of the documents released by Mr. Snowden detail concerns inside the NSA about drowning in information.
An internal briefing document in 2012 about foreign cellphone-location tracking
by the agency said the efforts were "outpacing our ability to ingest, process and
store" data. In March 2013, some NSA analysts asked for permission to collect less
data through a program called Muscular because the "relatively small intelligence
value it contains does not justify the sheer volume of collection," another document shows.
In response to questions about Mr. Binney's claims, an NSA spokeswoman says the agency is "not collecting everything,
but we do need the tools to collect intelligence on foreign adversaries who wish to do harm to the nation and its allies."
Existing surveillance programs were approved by "all three branches of government," and each branch "has a role in
oversight," she adds. In a statement through his lawyer, Mr. Snowden says: "When your working process every morning
starts with poking around a haystack of seven billion innocent lives, you're going to miss things." He adds: " We're
blinding people with data we don't need." A presidential panel recommended
earlier this month that the agency shut down its bulk collection of telephone-call
records of all Americans. The federal government could accomplish the same goal
by querying phone companies, the panel concluded. The panel also recommended
the creation of "smart software" that could sort data as the information is
collected, rather than the current system where "vast amounts of data are swept up and the sorting is done after it has
been copied" on to data-storage systems. Administration officials are reviewing the report. A
separate task force is expected to issue its own findings next year, and lawmakers have proposed several bills that would
change how the NSA collects and uses data. The 70-year-old Mr. Binney says he is generally underwhelmed by the panel's
"bureaucratic" report, though "it
would be something meaningful" if the controversy leads
to adoption of the "smart software" strategy and creation of a technology
oversight group with full access to "be in the knickers of the NSA" and Federal Bureau of
Investigation. Mr. Binney lives off his government pension and makes occasional appearances to talk about his work at the
NSA. The spy agency has defended its sweeping surveillance programs as essential in the fight against terrorism. But
having too much data can hurt those efforts, according to Mr. Binney and a
handful of colleagues who have raised concerns since losing an internal battle to
build privacy-protecting Internet surveillance tools in the late 1990s. At the time, the agency was
struggling to transform itself from a monitor of mostly analog signals, such as radio and satellite transmissions, to an
effective sleuth in the emerging digital world. Diane Roark, a House Intelligence Committee staff member assigned to
oversee the NSA, says she was "very disturbed" to learn in meetings at the agency's headquarters in Fort Meade, Md., in
1997 "what bad shape they were in." She saw a glimmer of hope in a corner of the NSA called the Sigint Automation
Research Center. Mr. Binney, who joined the agency in 1965 with a cadre of young mathematicians hired to tackle the
increasingly mathematical world of ciphers and codes, was working with the research center's chief to create an innovative
approach to monitoring Internet traffic. "Our approach was to focus on the known terrorist community, which
predominately existed overseas," recalls Ed Loomis, who ran the research center. "However, we were also interested in any
communications they had with anyone in America." The push was legally tricky. Only the FBI is allowed to collect such
information within the U.S.—and usually must prove to a judge that there is a good reason to launch surveillance. Mr.
Loomis worried that the rules were too restrictive and could hinder the NSA's terrorist-catching abilities. So Messrs.
Binney and Loomis built a system to scrape data from the Internet, throw away the content about U.S. citizens and zoom
in on the leftover metadata—or the "to" and "from" information in Internet traffic. They called it ThinThread. To keep the
data-gathering effort manageable, the two men designed ThinThread to collect data within "two hops" of a suspected bad
guy. That meant the system would be built to automatically flag people who communicated with "dirty numbers" or
possible terrorists—and records of people who contacted them. Messrs. Binney
and Loomis also believed
that ThinThread's powers should be constrained to protect the privacy of
Americans. Mr. Binney designed a way to encrypt all the U.S. metadata, and their
plans allowed the spy agency's analysts to unscramble the information only with
permission from a warrant approved by the Foreign Intelligence Surveillance
Court. The court oversees NSA activities that affect U.S. residents. ThinThread was never deployed.
Agency lawyers refused to relax a ban on recording any U.S. communications. Dickie George, a senior NSA official who
retired in 2011, says the consensus was that Mr. Binney's "heart was in the right place," but the technology wasn't ready.
Messrs. Binney and Loomis say ThinThread could have done the job for which it was built. But Mr. Loomis was told to
shut down the project. Instead, he was told, the NSA would fund a surveillance program called Trailblazer, built by outside
contractors. Distraught about the decision, Messrs. Binney and Loomis and another NSA employee, Kirk Wiebe,
announced plans to retire on Oct. 31, 2001. Mr. Binney reconsidered after the Sept. 11, 2001, terrorist attacks, but left as
intended after hearing about new plans to use his metadata-analysis technology to hunt for terrorists. There was one big
difference. The privacy protections designed to shield Americans from illegal intrusions weren't on the drawing board
anymore, he says. In 2002, the three retired NSA employees and Ms. Roark asked the Defense Department's inspector
general to investigate whether the decision to halt ThinThread and fund Trailblazer was made appropriately. Trailblazer's
data-filtering system was never built, either. Instead, NSA officials secretly sought and won support for an array of
programs to conduct warrantless wiretapping of phone and Internet content. They got similar approval to collect and
analyze metadata from nearly every U.S. phone call and vast swaths of Internet traffic. Mr. Binney settled into retirement.
But the spy agency's surveillance efforts began to draw more attention. In 2006, AT&T Inc. technician Mark Klein leaked
documents showing that the company was working with the NSA to scour the Internet with technology that was similar to
the system built by Messrs. Binney and Loomis. Outside criticism of the agency grew after articles in the New York Times
and Baltimore Sun about the agency's surveillance efforts, including ThinThread. President George W. Bush briefly shut
down the warrantless wiretapping program, but then parts of it were legalized by a new law passed in Congress.
Meanwhile, the metadata analysis program continued in secret. Federal officials suspected the three retired NSA
employees and Ms. Roark, the former House staff member, of involvement in the leaks, according to government
documents. FBI agents swooped in on all four, and Mr. Binney says agents drew their guns on him while he was in the
shower. A Justice Department official couldn't be reached for comment on the case. Messrs. Binney, Loomis and Wiebe
and Ms. Roark weren't charged with wrongdoing, but the FBI soon pursued NSA official Thomas Drake, a ThinThread
supporter. In 2010, prosecutors charged him with violating the Espionage Act, citing "willful retention" of classified
documents. Mr. Drake pleaded guilty to one count of exceeding authorized use of a government computer. Mr. Drake says
government officials "wanted to make an object lesson of me, drive the stake of national security right through me, and
then prop me out on the public commons as punishment for holding up the mirror of their own malfeasance and
usurpations of power." The raids and prosecution of Mr. Drake angered Mr. Binney. He decided to go public with his
concerns. In April 2012, he spoke at an event called a "Surveillance Teach-in" at the Whitney Museum of American Art in
New York. Wearing a short-sleeve, collared shirt and jeans, Mr. Binney looked like a grandfatherly professor amid the
crowd of activists, some wearing Anonymous masks. "I was focused on foreign threats," he said. "Unfortunately, after 9/11,
they took my solutions and directed them at this country and everybody in it." Mr. Binney's claims were hard to prove.
Even Mr. Loomis, the co-creator of ThinThread, didn't think it was possible that the same NSA lawyers who refused to
budge on the ban against recording any U.S. communications had approved more invasive surveillance procedures after
he left the agency. "After all my struggles with those folks, I just couldn't believe that they went 180 degrees against the
law," he said. In August 2012, filmmaker Laura Poitras released an eight-minute, online documentary about Mr. Binney.
She called him a whistleblower. Mr. Snowden saw the video and reached out to Ms. Poitras with an avalanche of
undisclosed documents, she says. Some of the documents leaked by the NSA contractor back up Mr. Binney. For example,
documents detailed the agency's two clandestine metadata-surveillance programs: the bulk collection of phone-calling
records and Internet traffic-analysis program. The NSA hasn't disputed the documents. The
Obama
administration says the Internet program was shut down in 2011, while the bulk
collection of phone records still is going on. John C. Inglis, the NSA's deputy director,
told lawmakers in July that the agency had court approval to do warrantless
"third-hop" queries of bulk telephone records. A "third-hop" analysis of one
suspected terrorist could allow the NSA to sift through the records of at least a
million people. Mr. Binney says he advised NSA officials to "never go beyond two
hops." He has urged lawmakers and an oversight board to limit data collection to
"two hops" and establish a technical auditing team to verify the spy agency's
claims about its data collection and usage. The presidential panel suggested
ending the bulk collection of phone metadata entirely. Instead, phone companies
should store the records and turn them over only with a court order, the panel
added. President Barack Obama will decide in coming weeks which of the panel's recommendations he will implement.
The recommendations aren't binding. In recent months, the retired computer-code creator has been greeted like a hero
almost everywhere he goes. Mr. Snowden, living in Russia under temporary asylum, says through his lawyer that he has
"tremendous respect" for Mr. Binney, "who did everything he could according to the rules."
Most recent ev
Puiu 15 – staff writer (Tibi, “The NSA is gathering so much data, it’s become swamped and
ironically ineffective at preventing terrorism,” ZME Science, 5/6/2015,
http://www.zmescience.com/research/technology/nsa-overwhelmed-data-53354/) //RGP
One of the most famous NSA whistleblowers (or the ‘original NSA whistleblower’), William Binney,
said the
agency is collecting stupendous amounts of data – so much that it’s actually
hampering intelligence operations. Binney worked for three decades for the intelligence agency, but left
shortly after the 9/11 attacks. A program he had developed was scrapped and replaced with a system he said was more
expensive and more intrusive, which made him feel he worked for an incompetent employer. Plans to enact the now
controversial Patriot Act were the last straw, so he quit. Since then, Binney has frequently criticized the agency and
revealed some of its operations hazards and weaknesses. Among these, he alleges: The
NSA buried key
intelligence that could have prevented 9/11; The agency’s bulk data collection from internet and
telephone communications is unconstitutional and illegal in the US; Electronic intelligence gathering is being used for
covert law enforcement, political control and industrial espionage, both in and beyond the US; Edward Snowden’s leaks
could have been prevented. Ironically, Snowden cites Binney as an inspiration. His greatest insight, however, is that the
NSA is ineffective at preventing terrorism because analysts are too swamped with
information under its bulk collection programme. Considering Binney’s impeccable track
record – he was co-founder and director of the World Geopolitical & Military
Analysis at the Signals Intelligence Automation Research Center (SARC), a branch with
6,000 employees – I can only presume he knows what he’s talking about. The Patriot Act is a U.S. law
passed in the wake of the September 11, 2001 terrorist attacks. Its goals are to strengthen domestic security and broaden
the powers of law-enforcement agencies with regard to identifying and stopping terrorists. In effect, the law relaxes the
restrictions authorities have to search telephone, e-mail communications, medical, financial, and other records. Because a
lot of people use web services whose servers are located in the US, this means that the records of people not located or
doing business in the US are also spied upon by the NSA. All
this information, however, comes at a
price: overload. According to the Guardian, the NSA buffers a whopping 21 petabytes a day! In this flood of information, an NSA analyst will quickly find himself overwhelmed. Querying keywords like “bomb” or “drugs” might prove a nightmare for the
analyst in question. It’s impossible not to, considering four billion people — around two-thirds of the world’s
population — are under the NSA and partner agencies’ watchful eyes, according to Binney. “That’s why they
couldn’t stop the Boston bombing, or the Paris shootings, because the data was
all there,” said Binney for ZDnet. “The data was all there… the NSA is great at going back over it forensically for
years to see what they were doing before that,” he said. “But that doesn’t stop it.” So, according to Binney, analysts
still use rudimentary tools to filter the vast amounts of information the NSA is
collecting. With everybody speaking about “big data” and other such buzz phrases, it’s a bit hilarious to
think the NSA is actually safeguarding against terrorism by looking for needles in
haystacks. “The Upstream program is where the vast bulk of the information was being collected,” said Binney,
talking about how the NSA tapped undersea fiber optic cables. Basically, the NSA is collecting as much data as it can
get its hands on at this point (legally or otherwise… ), but it all seems too greedy for their own good,
not to mention public safety. According to Binney, the fact the NSA is collecting this
much data isn’t to their advantage, but actually a vulnerability.
Information overload destroys system effectiveness
Mathiesen, professor of sociology of law, ’13 (Thomas Mathiesen Professor of
Sociology of Law at the University of Oslo, “Towards a Surveillant Society: The Rise of
Surveillance Systems in Europe” Pgs 195-196, 2013,
https://books.google.com/books?id=X1ZutlZgfD8C&dq=too+much+surveillance+AND+overload
&source=gbs_navlinks_s) //GY
Many of the large surveillance systems described earlier are not easy to use, or
close to unusable, when it comes to finding terrorists in advance, whether lone
wolves or groups. This goes for Eurodac, the Data Retention Directive, the various PNRs, the
API, the Prum Treaty, and perhaps also for the Schengen Information System. These five or
more systems face a common threat, namely what we may call information
overload. There is far too much information – in all of the systems, which makes
the picking out of terrorists on an individual or group basis in advance extremely
difficult. The Data Retention Directive, the various PNRs and the Prum Treaty are perhaps
particularly vulnerable to this. Take the Data Retention Directive. It collects all information
concerning communication (except content) on all citizens in a given State. The information has
to be retained for a long period of time – up to two years. Simple arithmetic tells us that
the information which has to be, and is, retained, becomes colossal. Let us say that a
particular State is small, and has roughly five million inhabitants (Norway is a small country, and
had 4,920,300 inhabitants on 1 January 2011; we will soon have 5 million). Most inhabitants
have telephones, often several mobile telephones, as well as access to the Internet and other
communication technologies. Say that communication technology equals the number of
inhabitants, five million for one year. This is clearly an underestimation, but roughly the average
retention period – Norway has in fact a retention period of one year. If the given State has
decided on mandatory retention for two years – which is the limit – the database
contains not five, but ten million technologies. However, each technology generates
a large number of data entries. If the given State has decided on mandatory retention on all
communication – who owns the communication technology which is used, who uses the
communication technology in question, at what time does the communication begin, at what time
does it end, from where the call is made and where it is received, whether the caller or the
called or both are moving around during the communication, to where they have moved, all of
this and a number of other data entries for one year brings the database to an enormous number
of millions of data entries per year. After one year the data which is stored has to be deleted. But
it never ends, because a similar number of data entries are stored for each
individual and for all of the five million inhabitants for another year, and another
year and another year… Add to this that not only the inhabitants of this particular State are
in the system, but so are all of the inhabitants of all the States of Europe (and outside States, like
Norway). You end up with a fabulous number of data entries which turns the famous finding of
the needle in the haystack into a reality – to say the least. For States deciding on two years of
mandatory retention – the limit – the number of data entries will be doubled – even more
fabulous. Many of the EU States are much larger than Norway – Great Britain had 61 million
inhabitants in 2009. There are 27 large and small States in the EU. You stop counting.
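Mathiesen's back-of-the-envelope count can be reproduced directly. The calls-per-day and fields-per-record figures below are illustrative assumptions (the book enumerates the kinds of data entries but gives no per-device figures, arguing only that the total becomes colossal):

```python
# Rough reconstruction of Mathiesen's retention arithmetic for a small
# state. The calls-per-day and fields-per-record values are assumptions
# for illustration; only the 5 million population and the two-year
# retention ceiling come from the card.

inhabitants = 5_000_000            # roughly Norway
devices_per_person = 1             # deliberately conservative
records_per_device_per_day = 10    # calls, texts, sessions (assumed)
fields_per_record = 8              # owner, user, start, end, cells, movement, ... (assumed)
retention_years = 2                # the Directive's upper limit

entries_per_year = (inhabitants * devices_per_person *
                    records_per_device_per_day * 365 * fields_per_record)
entries_retained = entries_per_year * retention_years

print(f"Data entries stored per year: {entries_per_year:,}")
print(f"Entries retained at any one time: {entries_retained:,}")
```

Even with these deliberately conservative inputs, a five-million-person state retains well over a hundred billion data entries at any moment, before adding the other 27 EU states the card mentions.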
Overload = nuke war
Information overload leads to a cyber crash – outweighs nuclear war
Goor, physicist and political scientist, MA Law, ’13 (Dan Goor, political
scientist, MA in Law and Diplomacy, “PRISM, a symptom of “information explosion,” beware!”
2013, https://dangoor.wordpress.com/2013/07/02/prism-a-symptom-of-information-explosion-beware/) //GY
PRISM, a symptom of “information explosion,” beware! While the political and
security implications of leaks by Edward Snowden are monopolizing the news, the main
danger is from information overload; misinterpretations and perhaps dangerous
(or even rogue) action could be the main issue. Too much information leads to
chaos. In the mid fifteen hundreds, Miguel de Cervantes Saavedra predicted that too much
information would drive people insane by demonstrating how his hero, Don Quixote, went mad
because he read too much. Albert Einstein said: “I fear the day when the [information]
technology overlaps with our humanity; the world will only have a generation of idiots.” In
George Orwell’s 1984 he wrote of a government that has total visibility into what every person does,
and soon we shall be able to read people’s minds, to know what thoughts each person may have.
Communication is a complex process, which lends itself to a high level of misinterpretation. With
government monitoring everything its citizenry does, and taking action based on
interpretation by both man and machine, one can expect an eventual state of
chaos in the world. This year the NSA’s one-and-one-half-million-square-foot facility in Utah
would become operational; it would accommodate the trillions of bits of information that the NSA is
gathering from the United States and from around the world. Following is Wikipedia’s overview of
the NSA facility: “The Utah Data Center, also known as the Intelligence Community
Comprehensive National Cybersecurity Initiative Data Center,[1] is a data storage facility for the
United States Intelligence Community that is designed to store extremely large amounts of
data.[2][3][4] Its purpose is to support the Comprehensive National Cybersecurity Initiative
(CNCI), though its precise mission is classified.[5] The National Security Agency (NSA), which
will lead operations at the facility, is the executive agent for the Director of National
Intelligence.[6] It is located at Camp Williams, near Bluffdale, Utah, between Utah Lake and
Great Salt Lake.” The Google information about judicial requests from various
countries supports the notion that the world is moving towards an information
overload; the world is heading towards a “cyber crash” that could well
dwarf any nuclear confrontation that may confront the human race. Should,
or could, safeguards be put in place to prevent information from going wild? Several years ago,
when Gordon Moore of Intel predicted that every few years computation power
would double, an alarm should have sounded. Moore was close to correct, except
that information technology is growing even faster and could become an
avalanche out of control. It is likely that the human race will survive the “cyber explosion,”
just as it survived Malthus’ prediction of resource shortage and the threat of atomic annihilation. That
notwithstanding, the prudent thing is for both scientists and politicians to exercise some
rational control on information growth.
Iran !
Overload causes Iran war
Trubbock, 14
Randall Trubbock, Master of Science in Cybersecurity from Utica College, May 2014, “The
Application Of Splunk In The Intelligence Gathering Process,” Proquest // is amp
Faulty intelligence methods, such as those that would be the result of
information overload, pose a significant threat to peace throughout the
world. For example, having inaccurate or incomplete intelligence on Iran’s nuclear
capabilities and the locations or nature of its nuclear plants, a risk-averse Israel
might overestimate its need to take both drastic and pre-emptive measures
against Iran (Young, 2013). This could result in involvement from several countries,
including the United States, potentially costing billions of dollars and thousands
of soldiers’ lives.
Turns aff – drones
Despite a plethora of surveillance technology border surveillance fails
due to data overload
Abrams and Cyphers No date (David, Chief Technology Officer, True Systems, Dennis,
VP Sales Operations, “TrueSentry Border Surveillance”,
http://www.daveab.com/files/TrueSentry_Border_Surveillance.pdf //Tang)
BORDER SECURITY remains a key homeland security challenge. Border guards,
surveillance operators, and command staff do not have an integrated command
and control system to protect national borders. There is a lack of sufficient coverage from sensors and cameras.
Important threats are lost in an overload of information from false alarms. The
collaboration and communications needed for tactical intercept missions are lacking. Wide and diverse terrain
coupled with large-scale population centers, sea ports, and national boundaries
make a difficult environment to effectively scale-up border surveillance. Far-field
cameras, thermal vision, and pan/tilt/zoom cameras are used to remotely monitor border
zones. Unmanned aerial vehicles (UAVs) fly continuous GPS-guided missions to give operators a bird’s eye view.
Surveillance towers typically use radar as a cost-effective way to get broad sensor coverage of diverse border
zone terrain. Intrusion detection systems like underground buried cables are used to monitor
electromagnetic field changes to distinguish between people, vehicles and animals at a perimeter. Outdoor motion
detection is also done with microwave and infrared sensors. Intelligent pressure fence sensors detect
when intruders climb or cut a fence-line. Yet for all these advances in surveillance equipment we
are still left with border guards struggling with high false alarm rates and low
probability of detection and intercept. There are just too many cameras and not
enough border forces to monitor them all. Threats are lost because of too many
false alarms. The cost of verifying targets, escalating them into threats, and
dispatching response teams is too high. Intercept mission teams do not have
effective collaboration tools. Intelligence and threat pattern analysis is just too
time consuming to thwart the next intrusion.
Border drones are ineffective and costly
Bennett 1/7 (Brian, reporter for the LA Times, 1/7/15, “Border drones are ineffective, badly
managed, too expensive, official says”, http://www.latimes.com/nation/immigration/la-na-border-drones-20150107-story.html //Tang)
Drones patrolling the U.S. border are poorly managed and ineffective at stopping illegal immigration, and the government should abandon a $400-million plan to expand their use, according to an internal watchdog report released Tuesday. The 8-year-old drone program has cost more than expected, according to a report by the Department of Homeland Security's inspector general, John Roth. Rather than spend more on drones, the department should "put those funds to better use," Roth recommended. He described the Predator B drones flown along the border by U.S. Customs and Border Protection as "dubious achievers." "Notwithstanding the significant investment, we see no evidence that the drones contribute to a more secure border, and there is no reason to invest additional taxpayer funds at this time," Roth said in a statement. The audit concluded that Customs and Border Protection could better use the funds on manned aircraft and ground surveillance technology. The drones were designed to fly over the border to spot smugglers and illegal border crossers. But auditors found that 78% of the time that agents had planned to use the craft, they were grounded because of bad weather, budget constraints or maintenance problems. Even when aloft, auditors found, the drones contributed little. Three drones flying around the Tucson area helped apprehend about 2,200 people illegally crossing the border in 2013, fewer than 2% of the 120,939 apprehended that year in the area. Border Patrol supervisors had planned on using drones to inspect ground-sensor alerts. But a drone was used in that scenario only six times in 2013. Auditors found that officials underestimated the cost of the drones by leaving out operating costs such as pilot salaries, equipment and overhead. Adding such items increased the flying cost nearly fivefold, to $12,255 per hour. "It really doesn't feel like [Customs and Border Protection] has a good handle on how it is using its drones, how much it costs to operate the drones, where that money is coming from or whether it is meeting any of its performance metrics," said Jennifer Lynch, a lawyer for the Electronic Frontier Foundation, a San Francisco-based privacy and digital rights group. The report's conclusions will make it harder for officials to justify further investment in the border surveillance drones, especially at a time when Homeland Security's budget is at the center of the battle over President Obama's program to give work permits to millions of immigrants in the country illegally. Each Predator B system costs about $20 million. "People think these kinds of surveillance technologies will be a silver bullet," said Jay Stanley, a privacy expert at the American Civil Liberties Union. "Time after time, we see the practical realities of these systems don't live up to the hype." Customs and Border Protection, which is part of Homeland Security, operates the fleet of nine long-range Predator B drones from bases in Arizona, Texas and North Dakota. The agency purchased 11 drones, but one crashed in Arizona in 2006 and another fell into the Pacific Ocean off San Diego after a mechanical failure last year. Agency officials said in response to the audit that they had no plans to expand the fleet aside from replacing the Predator that crashed last year. The agency is authorized to spend an additional $433 million to buy up to 14 more drones. The drones — unarmed versions of the MQ-9 Reaper drone flown by the Air Force to hunt targets in Pakistan, Somalia and elsewhere — fly the vast majority of their missions in narrowly defined sections of the Southwest border, the audit found. They spent most of their time along 100 miles of border in Arizona near Tucson and 70 miles of border in Texas. Rep. Henry Cuellar (D-Texas) has promoted the use of drones along the border but believes the agency should improve how it measures their effectiveness. Homeland Security "can't prove the program is effective because they don't have the right measures," Cuellar said in an interview. "The technology is good, but how you implement and use it — that is another question." The audit also said that drones had been flown to help the FBI, the Texas Department of Public Safety and the Minnesota Department of Natural Resources. Such missions have long frustrated Border Patrol agents, who complain that drones and other aircraft aren't available when they need them, said Shawn Moran, vice president of the Border Patrol agents' union. "We saw the drones were being lent out to many entities for nonborder-related operations and we said, 'These drones, if they belong to [Customs and Border Protection], should be used to support [its] operations primarily,'" Moran said.
Turns aff – generic
Additional surveillance directly trades off with security concerns –
the more information we have, the less effective counterterror
measures are
Greenwald 10 (Glenn, constitutional lawyer, 8/9, “The Digital Surveillance State: Vast,
Secret, and Dangerous”, http://www.cato-unbound.org/2010/08/09/glenn-greenwald/digital-surveillance-state-vast-secret-dangerous//Tang)
What makes this leviathan particularly odious is that it does not even supply the security which is endlessly invoked to justify it. It actually does the opposite. As many surveillance experts have repeatedly argued, including House Intelligence Committee member Rush Holt, the more secret surveillance powers we vest in the government, the more unsafe we become. Cato’s Julian Sanchez put it this way: “We’ve gotten so used to the ‘privacy/security tradeoff’ that it’s worth reminding ourselves, every now and again, that surrendering privacy does not automatically make us more secure—that systems of surveillance can themselves be a major source of insecurity.” That’s because the Surveillance State already collects so much information about us, our activities and our communications—so indiscriminately and on such a vast scale—that it is increasingly difficult for it to detect any actual national security threats. NSA whistleblower Adrienne Kinne, when exposing NSA eavesdropping abuses, warned of what ABC News described as “the waste of time spent listening to innocent Americans, instead of looking for the terrorist needle in the haystack.” As Kinne explained: By casting the net so wide and continuing to collect on Americans and aid organizations, it’s almost like they’re making the haystack bigger and it’s harder to find that piece of information that might actually be useful to somebody. You’re actually hurting our ability to effectively protect our national security. As the Post put it in its “Top Secret America” series: The NSA sorts a fraction of those [1.7 billion e-mails, phone calls and other types of daily collected communications] into 70 separate databases. The same problem bedevils every other intelligence agency, none of which have enough analysts and translators for all this work. That article details how ample information regarding alleged Ft. Hood shooter Nidal Hassan and attempted Christmas Day bomber Umar Abdulmutallab was collected but simply went unrecognized. Similarly, The Washington Post’s David Ignatius previously reported that Abdulmutallab was not placed on a no-fly list despite ample evidence of his terrorism connections because information overload “clogged” the surveillance system and prevented its being processed. Identically, Newsweek’s Mike Isikoff and Mark Hosenball documented that U.S. intelligence agencies intercept, gather and store so many emails, recorded telephone calls, and other communications that it’s simply impossible to sort through or understand what they have, quite possibly causing them to have missed crucial evidence in their possession about both the Fort Hood and Abdulmutallab plots: This deluge of Internet traffic—involving e-mailers whose true identity often is not apparent—is one indication of the volume of raw intelligence U.S. spy agencies have had to sort through…. The large volume of messages also may help to explain how agencies can become so overwhelmed with data that sometimes it is difficult, if not impossible, to connect potentially important dots. As a result, our vaunted Surveillance State failed to stop the former attack and it was only an alert airplane passenger who thwarted the latter. So it isn’t that we keep sacrificing our privacy to an always-growing National Security State in exchange for greater security. The opposite is true: we keep sacrificing our privacy to the always-growing National Security State in exchange for less security.
Info overload will collapse the surveillance state
North 13 (Gary, American Christian Reconstructionist theorist and economic historian. 7/29,
“Surveillance state will collapse; data overload increasingly blinds it”,
http://nooganomics.com/2013/07/surveillance-state-will-collapse-data-overload-increasingly-blinds-it///Tang)
Wyden trusts in the wisdom and power of political democracy. He is naive. He should trust in the free market. People’s day-to-day economic decisions are the heart of the matter, not their occasional voting. The individual decisions of people in the market will ultimately thwart Congress and the surveillance state. The free market’s signals, not the phone taps of the NSA, will shape the future. The bureaucrats’ quest for omniscience and omnipotence will come to a well-deserved end, just as it did in the Soviet Union, and for the same reason. The state is inherently myopic: short-sighted. Computers make it blind. The state focuses on the short run. Computers overwhelm bureaucrats with short-run information. Let us not forget that the Internet was invented by DARPA: the military’s research branch. It invented the Internet to protect the military’s communications network from a nuclear attack by the USSR. Today, there is no USSR. There is the World Wide Web: the greatest technological enemy of the state since Gutenberg’s printing press. The state is myopic. The fact that the NSA’s two “computer farms” — in Utah and in Maryland — are seven times larger than the Pentagon will not change this fact. They have bitten off more than they can chew. Central planners are bureaucrats, and bureaucracy is blind. It cannot assess accurately the importance of the mountains of data that are hidden in government-collected and program-assessed digits. The knowledge possessed in the free market is always more relevant. Society is the result of human action, not of human design. The bureaucrats do not understand this principle, and even if they did, it would not change reality.
Turns aff - NCTC
The size of the NCTC should be reduced to more effectively combat
terrorism
Storm 13 (Darlene, freelance writer, citing Bridget Nolan, sociology PhD, worked as a CT
analyst at the NCTC, 8/7 “Is US intelligence so big that counterterrorism is failing? 'Yes' say
insiders”, http://www.computerworld.com/article/2475096/security0/is-us-intelligence-so-big-that-counterterrorism-is-failing---yes--say-insiders.html//Tang)
When might you consider quitting your job to be a “win”? When you work for the CIA and “the Company” tries to block the publication of
your dissertation about the National Counterterrorism Center. Bridget
Rose Nolan, a sociology PhD at the University
of Pennsylvania, worked as a counterterrorism analyst at the National Counterterrorism Center
(NCTC) from 2010 – 2011. Basically she worked as an analyst while also conducting “ethnographic observations” by interviewing
16 female and seven male analysts for her doctoral dissertation at the University of Pennsylvania. The Philadelphia Inquirer explained, “She
set out to explore the culture of the terrorism center and how it, and its counterparts, share information – or fail to.” After three years of
“fighting” the CIA over the right to publish, she won, but the “win” meant she had to resign. Instead
of too big to fail, in
essence, counterterrorism may be failing in some areas because it is too big, because
counterterrorism analysts suffer from so much information overload that they are not effective in
stopping terrorism. “Fewer people in the system could help to streamline the bureaucracy and
reduce the number of emails and documents that make the analysts feel overwhelmed with
information.” Other contributing factors that make terrorism harder to fight include sabotage
among co-workers, stove-piping, confusion, bureaucracy that might make your head explode, and agencies
that don’t play well together. Several people working in counterterrorism suggested that the
solutions to be more effective include cutting out the bloat and making the intelligence
community smaller, much smaller. NCTC was formed as a “knee-jerk reaction to 9/11.” The continuing War on Terror leads
to more databases, more information which creates more stove-piping. There is no Google-like search to find information from one agency
to the other, and each intelligence agency hoards the good secret stuff for itself. One analyst suggested that NCTC was “never intended to be
real. That all along, it’s just been a CYA [‘cover your ass’] political maneuver.” Another
CT analyst suggested that if NCTC
were to continue, then it “should be about one-tenth its current size.” When it comes to intelligence
information, analysts must “publish or perish;” but it’s more about quantity than quality. There are endless “turf battles”
complete with paper ownership battles as well as “strategies of deception and sabotage.” Even
analysts working for the same agency might try to stall in order to “scoop” another analyst; they also
might try to “kill” the piece. That’s before the six months to two years for official reviews of the papers, “layers and layers of soul-crushing
review.” There
is also “a lack of faith in management both as qualified reviewers and as unbiased
supporters.” One analyst said, “Information sharing is when YOU give ME your data .”
Turns aff – NSA
NSA failing now – they're drowning in information – only the plan
makes surveillance effective
Angwin 13 – staff writer (“NSA Struggles to Make Sense of Flood of Surveillance Data,” WSJ,
12/25/2013,
http://www.wsj.com/articles/SB10001424052702304202204579252022823658850) //RGP
*Language edited
LAUSANNE, Switzerland— William
Binney, creator of some of the computer code used by the National Security
Agency to snoop on Internet traffic around the world, delivered an unusual message here in
September to an audience worried that the spy agency knows too much. It knows
so much, he said, that it can't understand what it has. "What they are doing is
making themselves dysfunctional by taking all this data," Mr. Binney said at a privacy
conference here. The agency is drowning in useless data, which harms its ability to
conduct legitimate surveillance, claims Mr. Binney, who rose to the civilian equivalent of a general during
more than 30 years at the NSA before retiring in 2001. Analysts are swamped with so much
information that they can't do their jobs effectively, and the enormous stockpile is an irresistible
temptation for misuse. Mr. Binney's warning has gotten far less attention than legal questions raised by leaks from former
NSA contractor Edward Snowden about the agency's mass collection of information around the world. Those revelations
unleashed a re-examination of the spy agency's aggressive tactics. MORE Snowden Warns of Dangers of Citizen
Surveillance But the NSA needs more room to store all the data it collects—and new phone records, data on money
transfers and other information keep pouring in. A new storage center being built in Utah will eventually be able to hold
more than 100,000 times as much as the contents of printed materials in the Library of Congress, according to outside
experts. Some of the documents released by Mr. Snowden detail concerns inside the NSA about drowning in information.
An internal briefing document in 2012 about foreign cellphone-location tracking
by the agency said the efforts were "outpacing our ability to ingest, process and
store" data. In March 2013, some NSA analysts asked for permission to collect less data through a program called
Muscular because the "relatively small intelligence value it contains does not justify the sheer volume of collection,"
another document shows. In response to questions about Mr. Binney's claims, an NSA spokeswoman says the agency is
"not collecting everything, but we do need the tools to collect intelligence on foreign adversaries who wish to do harm to
the nation and its allies." Existing surveillance programs were approved by "all three branches of government," and each
branch "has a role in oversight," she adds. In a statement through his lawyer, Mr. Snowden says: "When
your
working process every morning starts with poking around a haystack of seven
billion innocent lives, you're going to miss things." He adds: "We're blinding
[overwhelming] people with data we don't need."
Status quo NSA operations are ineffective due to information
overload – makes preventing terrorism impossible
Maass, acclaimed author and journalist on surveillance, 5/28 (PETER
MAASS “INSIDE NSA, OFFICIALS PRIVATELY CRITICIZE “COLLECT IT ALL”
SURVEILLANCE” 05/28/2015 11:38 AM https://firstlook.org/theintercept/2015/05/28/nsa-officials-privately-criticize-collect-it-all-surveillance/) //GY
AS MEMBERS OF CONGRESS struggle to agree on which surveillance programs to re-authorize
before the Patriot Act expires, they might consider the unusual advice of an intelligence analyst at
the National Security Agency who warned about the danger of collecting too much
data. Imagine, the analyst wrote in a leaked document, that you are standing in a shopping aisle
trying to decide between jam, jelly or fruit spread, which size, sugar-free or not, generic or
Smucker’s. It can be paralyzing.¶ “We in the agency are at risk of a similar, collective paralysis in
the face of a dizzying array of choices every single day,” the analyst wrote in 2011. “’Analysis
paralysis’ isn’t only a cute rhyme. It’s the term for what happens when you spend
so much time analyzing a situation that you ultimately stymie any outcome …. It’s
what happens in SIGINT [signals intelligence] when we have access to endless
possibilities, but we struggle to prioritize, narrow, and exploit the best ones.Ӧ The
document is one of about a dozen in which NSA intelligence experts express concerns usually
heard from the agency’s critics: that the U.S. government’s “collect it all” strategy can
undermine the effort to fight terrorism. The documents, provided to The Intercept by NSA
whistleblower Edward Snowden, appear to contradict years of statements from senior officials
who have claimed that pervasive surveillance of global communications helps the government
identify terrorists before they strike or quickly find them after an attack.¶ The Patriot Act, portions
of which expire on Sunday, has been used since 2001 to conduct a number of dragnet surveillance
programs, including the bulk collection of phone metadata from American companies. But the
documents suggest that analysts at the NSA have drowned in data since 9/11,
making it more difficult for them to find the real threats. The titles of the documents
capture their overall message: “Data Is Not Intelligence,” “The Fallacies Behind the Scenes,”
“Cognitive Overflow?” “Summit Fever” and “In Praise of Not Knowing.” Other titles include
“Dealing With a ‘Tsunami’ of Intercept” and “Overcome by Overload?”¶ The documents are not
uniform in their positions. Some acknowledge the overload problem but say the agency is
adjusting well. They do not specifically mention the Patriot Act, just the larger dilemma of cutting
through a flood of incoming data. But in an apparent sign of the scale of the problem, the
documents confirm that the NSA even has a special category of programs that is
called “Coping With Information Overload.”¶ The jam vs. jelly document, titled “Too
Many Choices,” started off in a colorful way but ended with a fairly stark warning: “The SIGINT
mission is far too vital to unnecessarily expand the haystacks while we search for
the needles. Prioritization is key.Ӧ These doubts are infrequently heard from officials inside
the NSA. These documents are a window into the private thinking of mid-level
officials who are almost never permitted to discuss their concerns in public.¶ AN
AMUSING PARABLE circulated at the NSA a few years ago. Two people go to a farm and
purchase a truckload of melons for a dollar each. They then sell the melons along a busy road for
the same price, a dollar. As they drive back to the farm for another load, they realize they aren’t
making a profit, so one of them suggests, “Do you think we need a bigger truck?” ¶ The parable was
written by an intelligence analyst in a document dated Jan. 23, 2012 that was titled, “Do We Need
a Bigger SIGINT Truck?” It expresses, in a lively fashion, a critique of the agency’s effort to collect
what former NSA Director Keith Alexander referred to as “the whole haystack.” The critique
goes to the heart of the agency’s drive to gather as much of the world’s
communications as possible: because it may not find what it needs in a partial
haystack of data, the haystack is expanded as much as possible, on the
assumption that more data will eventually yield useful information. The Snowden
files show that in practice, it doesn’t turn out that way: more is not necessarily
better, and in fact, extreme volume creates its own challenges.¶ “Recently I tried to
answer what seemed like a relatively straightforward question about which telephony metadata
collection capabilities are the most important in case we need to shut something off when the
metadata coffers get full,” wrote the intelligence analyst. “By the end of the day, I felt like
capitulating with the white flag of, ‘We need COLOSSAL data storage so we don’t have to worry
about it,’ (aka we need a bigger SIGINT truck).” The analyst added, “Without metrics, how
do we know that we have improved something or made it worse? There’s a
running joke … that we’ll only know if collection is important by shutting it off
and seeing if someone screams.Ӧ Another document, while not mentioning the
dangers of collecting too much data, expressed concerns about pursuing
entrenched but unproductive programs.¶ “How many times have you been watching a
terrible movie, only to convince yourself to stick it out to the end and find out what happens, since
you’ve already invested too much time or money to simply walk away?” the document asked.
“This ‘gone too far to stop now’ mentality is our built-in mechanism to help us
allocate and ration resources. However, it can work to our detriment in
prioritizing and deciding which projects or efforts are worth further expenditure
of resources, regardless of how much has already been ‘sunk.’ As has been said before, insanity
is doing the same thing over and over and expecting different results.” Many of these documents
were written by intelligence analysts who had regular columns distributed on NSANet, the
agency’s intranet. One of the columns was called “Signal v. Noise,” another was called “The
SIGINT Philosopher.” Two of the documents cite the academic work of Herbert Simon, who won
a Nobel Prize for his pioneering research on what’s become known as the attention economy.
Simon wrote that consumers and managers have trouble making smart choices because their
exposure to more information decreases their ability to understand the information. Both
documents mention the same passage from Simon’s essay, Designing Organizations for an
Information-Rich World:¶ “In an information-rich world, the wealth of information means a
dearth of something else: a scarcity of whatever it is that information consumes. What
information consumes is rather obvious: it consumes the attention of its recipients. Hence a
wealth of information creates a poverty of attention and a need to allocate that
attention efficiently among the overabundance of information sources that might
consume it.Ӧ In addition to consulting Nobel-prize winning work, NSA analysts have turned to
easier literature, such as Malcolm Gladwell’s best-selling Blink: The Power of Thinking Without
Thinking. The author of a 2011 document referenced Blink and stated, “The key to good decision
making is not knowledge. It is understanding. We are swimming in the former. We are
desperately lacking in the latter.” The author added, “Gladwell has captured one of the biggest
challenges facing SID today. Our costs associated with this information overload are
not only financial, such as the need to build data warehouses large enough to store the
mountain of data that arrives at our doorstep each day, but also include the more
intangible costs of too much data to review, process, translate and report.Ӧ
Alexander, the NSA director from 2005 to 2014 and chief proponent of the agency’s “collect it all”
strategy, vigorously defended the bulk collection programs. “What we have, from my perspective,
is a reasonable approach on how we can defend our nation and protect our civil liberties and
privacy,” he said at a security conference in Aspen in 2013. He added, “You need the haystack to
find the needle.” The same point has been made by other officials, including James Cole, the
former deputy attorney general who told a congressional committee in 2013, “If you’re looking for
the needle in the haystack, you have to have the entire haystack to look through.” The opposing
viewpoint was voiced earlier this month by Snowden, who noted in an interview with the
Guardian that the men who committed recent terrorist attacks in France, Canada and Australia
were under surveillance—their data was in the haystack yet they weren’t singled out. “It wasn’t the
fact that we weren’t watching people or not,” Snowden said. “It was the fact that we were watching
people so much that we did not understand what we had. The problem is that when you
collect it all, when you monitor everyone, you understand nothing.Ӧ In a 2011
interview with SIDtoday, a deputy director in the Signals Intelligence Directorate was asked about
“analytic modernization” at the agency. His response, while positive on the NSA’s ability to
surmount obstacles, noted that it faced difficulties, including the fact that some targets use
encryption and switch phone numbers to avoid detection. He pointed to volume as a particular
problem.¶ “We live in an Information Age when we have massive reserves of information and
don’t have the capability to exploit it,” he stated. “I was told that there are 2 petabytes of data in
the SIGINT System at any given time. How much is that? That’s equal to 20 million 4-drawer
filing cabinets. How many cabinets per analyst is that? By the end of this year, we’ll have 1
terabyte of data per second coming in. You can’t crank that through the existing
processes and be effective.Ӧ The documents noted the difficulty of sifting
through the ever-growing haystack of data. For instance, a 2011 document titled “ELINT
Analysts – Overcome by Overload? Help is Here with IM&S” outlined a half dozen computer tools
that “are designed to invert the paradigm where an analyst spends more time searching for data
than analyzing it.” Another document, written by an intelligence analyst in 2010, bluntly stated
that “we are drowning in information. And yet we know nothing. For sure.” The analyst went on
to ask, “Anyone know just how many tools are available at the Agency, alone? Would you know
where to go to find out? Anyone ever start a new target…without the first clue where to begin? Did
you ever start a project wondering if you were the sole person in the Intelligence Community to
work this project? How would you find out?” The analyst, trying to encourage more sharing of tips
about the best ways to find data in the haystack, concluded by writing, in boldface, “Don’t let
those coming behind you suffer the way you have.Ӧ
Overload makes surveillance ineffective and infringes on civil rights
Ward, journalist, 5/18 (Stan Ward, correspondent for Best VPN “NSA swamped with
data overload also trashes the Constitution” 18 May 2015
https://www.bestvpn.com/blog/19187/nsa-swamped-with-data-overload-also-trashes-the-constitution/) //GY
Almost on the second anniversary of the Edward Snowden revelations, another (in)famous NSA
whistleblower has again spoken up. This comes at a pivotal juncture in the legislative calendar as
contentious debate about surveillance rages over the impending sunset of some of the Patriot
Act.¶ It has long been an argument of the civil liberties crowd that bulk data
gathering was counter-productive, if not counter-intuitive. The argument was
couched in language suggesting that to “collect it all”, as the then NSA director James
Clapper famously decried, was to, in effect, gather nothing, as the choking amounts of
information collected would be so great as to be unable to be analyzed effectively.¶
This assertion is supported by William Binney, a founder of Contrast Security and a former NSA
official, logging more than three decades at the agency. In alluding to what he termed
“bulk data failure”, Binney said that an analyst today can run one simple query
across the NSA’s various databases, only to become immediately overloaded with
information.¶ With about four billion people (around two-thirds of the world’s population)
under the NSA and partner agencies’ watchful eyes, according to his estimates, there is far too
much data being collected.¶ “That’s why they couldn’t stop the Boston bombing, or the Paris
shootings, because the data was all there… The data was all there… the NSA is great at going back
over it forensically for years to see what they were doing before that. But that doesn’t stop it.” ¶
Binney is in a position to know, earning his stripes during the terrorism build up that culminated
with the 9/11 World Trade Center bombing in 2001. He left just days after the draconian
legislation known as the USA Patriot Act was enacted by Congress on the heels of that attack. One
of the reasons which prompted his leaving was the scrapping of a surveillance system on which he
long worked, only to be replaced by more intrusive systems.¶ It is interesting to note here that
Edward Snowden, in alluding to Binney, said he was inspired by Binney’s plight, and that this, in
part, prodded him to leak thousands of classified documents to journalists. Little did Binney
know that his work was to be but the tip of the iceberg in a program that eventually grew to
indiscriminately “collect it all.”¶ What is worrisome is the complicity with the bulk data collection
by dozens of private companies – maybe as many as 72. Yet this type of collection pales in
comparison to that of the “Upstream” program in which the NSA tapped into undersea fiber optic
cables. With the cooperation of Britain’s GCHQ, the NSA is able to sift more than 21 petabytes a
day.¶ Gathering such enormous amounts of information is expensive and
ineffective, according to Binney. But it gets lawmakers attention in a way that results
in massive increases in NSA budgets. Binney warns that,¶ “They’re taking away
half of the Constitution in secret.”¶ President Obama has presided over this agency’s land
grab, and has endorsed it, often referring to Upstream as a “critical national security tool.” His
feckless approach to the spying build up is the reason for its proliferation, and is
why Congress meanders rudderless in attempts to curtail it.¶ The President’s
anti-privacy stance is being “rewarded” by repudiation among members of his
own party, and is reflected in their rejecting his latest legacy-building, pet piece of legislation –
the Trans Pacific Partnership (TPP). But their constituents would be better served by
producing legislation that would restore Constitutional rights trampled on by the
NSA.
Turns aff – NSA – grid collapse
Strain on surveillance systems threatens power disruptions –
collapses the agency
Gorman, ’06 (SIOBHAN GORMAN, senior reporter Baltimore Sun “NSA risking electrical
overload” August 06, 2006 http://articles.baltimoresun.com/2006-08-06/news/0608060158_1_agency-power-surges-nsa/3) //GY
WASHINGTON -- The National Security Agency is running out of juice.¶ The demand for
electricity to operate its expanding intelligence systems has left the high-tech
eavesdropping agency on the verge of exceeding its power supply, the lifeblood of
its sprawling 350-acre Fort Meade headquarters, according to current and former intelligence
officials.¶ Agency officials anticipated the problem nearly a decade ago as they looked ahead at the
technology needs of the agency, sources said, but it was never made a priority, and now
the agency's ability to keep its operations going is threatened. The NSA is already
unable to install some costly and sophisticated new equipment, including two new
supercomputers, for fear of blowing out the electrical infrastructure, they said.¶ At minimum, the
problem could produce disruptions leading to outages and power surges at the
Fort Meade headquarters, hampering the work of intelligence analysts and
damaging equipment, they said. At worst, it could force a virtual shutdown of the
agency, paralyzing the intelligence operation, erasing crucial intelligence data
and causing irreparable damage to computer systems -- all detrimental to the
fight against terrorism.¶ Estimates on how long the agency has to stave off such an overload
vary from just two months to less than two years. NSA officials "claim they will not be able to
operate more than a month or two longer unless something is done," said a former senior NSA
official familiar with the problem, who spoke on condition of anonymity.¶ Agency leaders,
meanwhile, are scrambling for stopgap measures to buy time while they develop a sustainable
plan. Limitations of the electrical infrastructure in the main NSA complex and the
substation serving the agency, along with growing demand in the region, prevent
an immediate fix, according to current and former government officials.¶ "If there's a major
power failure out there, any backup systems would be inadequate to power the
whole facility," said Michael Jacobs, who headed the NSA's information assurance division
until 2002.¶ "It's obviously worrisome, particularly on days like today," he said in an interview
during last week's barrage of triple-digit temperatures.¶ William Nolte, a former NSA executive
who spent decades with the agency, said power disruptions would severely hamper the
agency.¶ "You've got an awfully big computer plant and a lot of precision equipment, and I don't
think they would handle power surges and the like really well," he said. "Even re-calibrating
equipment would be really time consuming -- with lost opportunities and lost up-time."¶ Power
surges can also wipe out analysts' hard drives, said Matthew Aid, a former NSA analyst
who is writing a multivolume history of the agency. The information on those hard drives is so
valuable that many NSA employees remove them from their computers and lock them in a safe
when they leave each day, he said.¶ A half-dozen current and former government officials
knowledgeable about the energy problem discussed it with The Sun on condition of anonymity
because of the sensitivity of the issue.¶ NSA spokesman Don Weber declined to comment on
specifics about the NSA's power needs or what is being done to address them, saying that even
private companies consider such information proprietary.¶ In a statement to The Sun, he said that
"as new technologies become available, the demand for power increases and NSA must determine
the best and most economical way to use our existing power and bring on additional capacity." ¶
Biggest BGE customer¶ The NSA is Baltimore Gas & Electric's largest customer, using as much
electricity as the city of Annapolis, according to James Bamford, an intelligence expert and author
of two comprehensive books on the agency.¶ BGE spokeswoman Linda Foy acknowledged a power
company project to deal with the rising energy demand at the NSA, but she referred questions
about it to the NSA.¶ The agency got a taste of the potential for trouble Jan. 24, 2000,
when an information overload, rather than a power shortage, caused the NSA's
first-ever network crash. It took the agency 3 1/2 days to resume operations, but with a
power outage it could take considerably longer to get the NSA humming again. ¶ The 2000
shutdown rendered the agency's headquarters "brain-dead," as then-NSA Director Gen. Michael
V. Hayden told CBS's 60 Minutes in 2002.¶ "I don't want to trivialize this. This was really bad,"
Hayden said. "We were dark. Our ability to process information was gone."¶ As an
immediate fallback measure, the NSA sent its incoming data to its counterpart in Great Britain,
which stepped up efforts to process the NSA's information along with its own, said Bamford. ¶ The
agency came under intense criticism from members of Congress after the crash, and the incident
rapidly accelerated efforts to modernize the agency.¶ One former NSA official familiar with the
electricity problem noted a sense of deja vu six years later. "To think that this was not a priority
probably tells you more about the extent to which NSA has actually transformed," the former
official said. "In the end, if you don't have power, you can't do [anything]."¶ Already
some equipment is not being sufficiently cooled, and agency leaders have forgone plugging in
some new machinery, current and former government officials said. The power shortage will also
delay the installation of two new, multimillion-dollar supercomputers, they said.¶ To begin to
alleviate pressure on the electrical grid, the NSA is considering buying additional generators and
shutting down so-called "legacy" computer systems that are decades old and not considered
crucial to the agency's operations, said three current and former government officials familiar
with the situation.¶
Turns aff – UPSTREAM
Information collected by UPSTREAM is ineffective and
counterproductive – causes info overload which makes terror attacks
less likely to be detected
Whittaker 4/30 (Zach, writer-editor for ZDNet and sister sites CNET and CBS News,
4/30/15, “NSA is so overwhelmed with data, it's no longer effective, says whistleblower,”
http://www.zdnet.com/article/nsa-whistleblower-overwhelmed-with-data-ineffective/) //Tang
A former National Security Agency official turned whistleblower has spent almost a decade and a half in
civilian life. And he says he's still "pissed" by what he's seen leak in the past two years. In a lunch meeting hosted by Contrast Security
founder Jeff Williams on Wednesday, William Binney, a former NSA official who spent more than three
decades at the agency, said the US government's mass surveillance programs
have become so engorged with data that they are no longer effective, losing vital
intelligence in the fray. That, he said, can -- and has -- led to terrorist attacks
succeeding. As the Snowden leaks began, there was "fear and panic" in Congress. Just a few minutes after the first NSA leak was
published, the phones of US lawmakers began to buzz, hours before most of America would find out over their morning coffee. Binney said
an analyst today can run one simple query across the NSA's various databases,
only to become immediately overloaded with information. With about four billion people -- around
two-thirds of the world's population -- under the NSA and partner agencies' watchful eyes, according to his estimates, there is too
much data being collected. "That's why they couldn't stop the Boston bombing, or
the Paris shootings, because the data was all there," said Binney. Because the agency isn't
carefully and methodically setting its tools up for smart data collection, that
leaves analysts to search for a needle in a haystack. "The data was all there... the NSA is great at going
back over it forensically for years to see what they were doing before that," he said. "But that doesn't stop it." Binney called this
a "bulk data failure" -- in that the NSA programs, leaked by Edward Snowden, are collecting too
much for the agency to process. He said the problem runs deeper across law enforcement and other federal agencies,
like the FBI, the CIA, and the Drug Enforcement Administration (DEA), which all have access to NSA intelligence. Binney left the NSA a
month after the September 11 attacks in New York City in 2001, days after controversial counter-terrorism legislation was enacted -- the
Patriot Act -- in the wake of the attacks. Binney stands jaded by his experience leaving the shadowy eavesdropping agency, but impassioned
for the job he once had. He left after a program he helped develop was scrapped three weeks prior to September 11, replaced by a system he
said was more expensive and more intrusive. Snowden said he was inspired by Binney's case, which in part inspired him to leak thousands
of classified documents to journalists. Since then, the NSA has ramped up its intelligence gathering
mission to indiscriminately "collect it all." Binney said the NSA is today not as interested in phone records --
such as who calls whom, when, and for how long. Although the Obama administration calls the program a "critical national security tool,"
the agency is increasingly looking at the content of communications, as the Snowden disclosures have shown. Binney said he estimated that
a "maximum" of 72 companies were participating in the bulk records collection program -- including Verizon, but said it was a drop in the
ocean. He also called PRISM, the clandestine surveillance program that grabs data from nine named Silicon Valley giants, including Apple,
Google, Facebook, and Microsoft, just a "minor part" of the data collection process. "The Upstream program is where
the vast bulk of the information was being collected," said Binney, talking about how
the NSA tapped undersea fiber optic cables. With help from its British counterparts at GCHQ, the NSA
is able to "buffer" more than 21 petabytes a day. Binney said the "collect it all" mantra
now may be the norm, but it's expensive and ineffective. "If you have to collect everything, there's
an ever increasing need for more and more budget," he said. "That means you can build your empire." They say you never leave the
intelligence community. Once you're a spy, you're always a spy -- it's a job for life, with few exceptions. One of those is blowing the whistle,
which he did. Since then, he has spent his retirement lobbying for change and reform in industry and in Congress. "They're taking away half
of the constitution in secret," said Binney. "If they want to change the constitution, there's a way to do that -- and it's in the constitution."
An NSA spokesperson did not immediately comment.
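For scale, the 21-petabytes-a-day Upstream figure Binney cites can be turned into a sustained throughput with a quick back-of-the-envelope conversion (an illustrative sketch; the constants below are just standard SI unit definitions, not figures from the article):

```python
# Rough scale check on the "21 petabytes a day" Upstream buffering claim.
PETABYTE = 10**15          # bytes, SI definition
SECONDS_PER_DAY = 86_400

daily_bytes = 21 * PETABYTE
throughput_gb_per_s = daily_bytes / SECONDS_PER_DAY / 10**9
yearly_exabytes = daily_bytes * 365 / 10**18

print(f"~{throughput_gb_per_s:.0f} GB/s sustained")   # ≈ 243 GB/s
print(f"~{yearly_exabytes:.1f} exabytes per year")    # ≈ 7.7 EB/yr
```

At roughly 243 GB every second, around the clock, the volume itself illustrates why Binney argues analysts are overloaded rather than informed.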
Turns cyberterror
Mass surveillance collapses the internet and makes cyberterror likely
Bryant, VICE, 1/26 (Ben Bryant, VICE News “Mass Surveillance Does Not Stop Terrorists,
Europe's Top Rights Body Says” January 26, 2015 https://news.vice.com/article/mass-surveillance-does-not-stop-terrorists-europes-top-rights-body-says) //GY
Mass surveillance is ineffective in the fight against terrorism, threatens human rights
and violates the privacy enshrined in European law, Europe's top rights body has said. ¶ Among a
raft of non-binding proposals, parliamentary watchdogs should be given the power to approve
intelligence agencies' budgets and whistleblowers should be offered statutory protection, a report
by the assembly of the Council of Europe said.¶ The 35-page document drafted by Dutch
parliamentarian Pieter Omtzigt proposes measures that should be taken by the assembly's 47
European member states before the "industrial-surveillance complex spins out of control." The
assembly, which will now debate the report, provides recommendations to the European Court of
Human Rights which are not legally binding but can be influential. European governments are
free to ignore the assembly's recommendations, but must explain why if they choose to do so.¶ The
report also says that current British laws may be incompatible with the European convention on
human rights, an internationally binding treaty. British surveillance may contradict Article 8, the
right to privacy; Article 10, the right to freedom of expression; and Article 6, the right to a fair
trial.¶ In wake of Paris attacks, David Cameron calls for new powers to break encrypted
communications. Read more here.¶ The assembly has been investigating the question of
surveillance since last year, and in April heard evidence via videolink from Edward Snowden, the
fugitive US National Security Agency whistleblower.¶ Its report was dismissive of the value
of intelligence gleaned from mass surveillance, saying: "We have seen that mass
surveillance is not even effective as a tool in the fight against terrorism and
organised crime, in comparison with traditional targeted surveillance."¶ It does not
specifically mention the recent Paris terrorist attacks in which 17 people were shot dead by
terrorists, however. UK Prime Minister David Cameron has used the Paris shootings to call for
widening surveillance powers, despite admissions from France that the attackers were known to
the authorities, but that they discontinued eavesdropping last summer.¶ Citing independent
US reviews of mass surveillance, the report said "resources that might prevent
attacks are diverted to mass surveillance, leaving potentially dangerous persons
free to act."¶ Some aspects of mass surveillance, such as the deliberate weakening
of encryption, even present "a grave danger for national security" the report said,
because such weaknesses "can be detected and exploited by rogue
states, terrorists, cyber-terrorists and ordinary criminals." Cameron has
recently called for new powers to break encrypted communications.¶ UK will ask preschool
teachers to spy on children in latest counter-terror proposals. Read more here.¶ Mass
surveillance threatens "the very existence of the Internet as we know it"
and "nobody and nothing is safe from snooping by our own countries' and even
foreign intelligence services" without technology that safeguards privacy, the
document added.¶ The assembly also sent a letter to the German, British and US authorities
asking if they had circumvented laws restricting domestic spying by getting a third party to do it
for them. The Germans and British deny the accusation, but the US has failed to reply. ¶ The report
concludes that the British response was probably true — because the UK's Data Retention and
Investigatory Powers Act already allows for the wide-ranging collection of personal data.¶ Eric
King, deputy director of privacy NGO Privacy International, told VICE News: "This latest report
highlights what has been said all along: intelligence agencies in the UK are in the business of
mass, indiscriminate surveillance and there are few if any legal safeguards in place to protect
human rights.¶ "It's embarrassing that the British government continues to neither confirm nor
deny the essential facts behind this, limiting the opportunity for debate, limiting the opportunity
for reform, and limiting proper accountability in the courts.¶ "Secret interpretations of secret laws
are plainly not a sustainable position, and place democracy and the rule of law in jeopardy."
Overload negates effectiveness of cybersecurity operations
Conti et al, Director of the Information Technology Operations
Center, ’06 (Gregory Conti, Kulsoom Abdullah, Julian Grizzard, John Stasko, John A.
Copeland, Mustaque Ahamad, Henry L. Owen, and Chris Lee, Georgia Institute of Technology,
“Countering Security Information Overload through Alert and Packet Visualization,”
March/April 2006, published by the IEEE Computer Society,
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1607922) //GY
The massive amount of security data that network sensors and host-based
applications generate can quickly overwhelm the operators charged with
defending the network. Often, operators overlook important details, and it’s
difficult to gain a coherent picture of network health and security status by
manually traversing textual logs, using command-line analysis scripts or traditional graphing and
charting techniques. In many instances, this flood of data will actually reduce the
overall level of security by consuming operators’ available time or misdirecting their
efforts. In extreme circumstances, the operators will become desensitized and ignore
security warnings altogether, effectively negating the value of their security
systems.
…
Information visualization of security-related data bears great promise in making our personal
computers, servers, and networks more secure. Such work is both an art and a science requiring
expertise from the computer graphics, information visualization, interface design, and security
communities to turn the raw security data into insightful and actionable information and
knowledge. There is no shortage of raw data—in fact there is far more than can be
analyzed by today’s best tools. Humans often cope with this torrent of data by
using crude statistical techniques, textual displays, and outdated graphical
techniques and by ignoring large portions of the data. We believe that security
visualization, at its best, is both compelling as a video game and several orders of magnitude
more effective than the tools we employ today. In this article, we moved toward this goal by
exploring the design, implementation, and evaluation of two complementary systems springing
from immediate, high-priority security needs and developed by an interdisciplinary team of
researchers. By bringing together diverse ideas and expertise, we directly addressed significant
problems facing the people who defend our information technology resources.
New CISA bill fails to effectively stop cyberattacks – gathering more
information distracts officials from fixing structural problems
Castillo 5/7 (Andrea, program manager of the Technology Policy Program for the Mercatus
Center at George Mason University, 5/7/15, “Cybersecurity bill more likely to promote
information overload than prevent cyberattacks,” http://thehill.com/blogs/congress-blog/homeland-security/241242-cybersecurity-bill-more-likely-to-promote-information) //Tang
A growing number of information security and hacking incidents emphasize the importance of improving U.S.
cybersecurity practices. But many computer security experts are concerned that the
Cybersecurity Information Sharing Act of 2015 (CISA) is unlikely to meaningfully prevent
cyberattacks as supporters claim. Rather, it will provide another avenue for federal
offices to extract private data without addressing our root cybersecurity
vulnerabilities. The main premise of CISA is that cyber breaches can be prevented by encouraging private
companies to share cyber threat data with the government. CISA would extend legal immunity to private entities that
share sensitive information about security vulnerabilities—often containing personally identifiable information (PII)
about users and customers—with federal offices like the Department of Justice (DOJ), Department of Homeland Security
(DHS) and Director of National Intelligence (DNI). This concerns privacy advocates who point out that such data
collection could serve as an alternative surveillance tool for the NSA. Section 5(A) of CISA authorizes federal agencies to
“disclose, retain, and use” shared data for many purposes beyond promoting cybersecurity, like investigating terrorism,
the sexual exploitation of children, violent felonies, fraud, identity theft, and trade secret violation. In other words, CISA
would allow federal agencies to use data obtained under the auspices of “cybersecurity protection” in entirely unrelated
criminal investigations — potentially indefinitely. Indeed, CISA is currently stalled in the Senate in deference to debate
over the NSA’s controversial bulk collection programs. But the Senate cool-down should not let us forget that CISA does
not just threaten civil liberties, it could actually undermine cybersecurity.
Information security experts
point out that existing information sharing measures run by private companies
like IBM and Dell SecureWorks rarely prevent attacks like CISA advocates
promise. One survey of information security professionals finds that 87 percent of responders did not believe
information sharing measures such as CISA will significantly reduce privacy breaches. The federal government already
operates at least 20 information sharing offices collaborating on cybersecurity with the private sector, as Eli Dourado and
I found in our new analysis through the Mercatus Center at George Mason University. These numerous federal
information-sharing initiatives have not stemmed the tidal wave of government
cyberattacks. Another Mercatus Center analysis Dourado and I conducted finds that the number of reported federal
information security failures has increased by an astounding 1,012 percent—from 5,502 in FY 2006 to 61,214 in FY 2013.
Almost 40 percent of these involved the PII of federal employees and civilians. CISA could therefore have the unintended
consequence of creating a juicy and unprepared target for one-stop hacking. The
Office of Management
and Budget reports that many of the federal agencies that would be given large
data management responsibilities through CISA, like the DOJ and DHS, reported
thousands of such breaches in FY 2014. These agencies’ own information security systems are unlikely
to become miraculously impervious to external hacking upon CISA’s passing. In fact, the massive amounts of
new data to manage could further overwhelm currently suboptimal practices. The
federal government’s information security failures indicate a technocratic
mindset that falsely equates the complexity of bureaucracy with the strength of a
solution. In reality, the government’s brittle and redundant internal cybersecurity
policies actively contribute to their security challenges. The Government Accountability Office
(GAO) has reported for years that such overlapping and unclear responsibility in federal
cybersecurity policy limits the offices’ ultimate effectiveness. A 2015 GAO investigation
concludes that without significant change “the nation’s most critical federal and
private sector infrastructure systems will remain at increased risk of attack from
adversaries.” The federal government must get its own house in order before such
comprehensive information sharing measures like CISA could be even technically
feasible. But CISA would be a failure even if managed by the most well-managed government systems because it seeks
to impose a technocratic structure on a dynamic system. Effective reform will promote a self-organizing “collaborative security approach” as outlined by groups like the Internet Society, an
international nonprofit devoted to Internet policy and technology standards. Cybersecurity provision is too important a
problem to be inadequately addressed by measures that will fail to improve security.
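The 1,012 percent growth figure in the Mercatus analysis quoted above follows directly from the two incident counts it cites; a quick sketch of the percent-change arithmetic (assuming the standard (new − old) / old definition):

```python
# Growth in reported federal information security failures,
# using the FY2006 and FY2013 counts quoted in the card.
fy2006_failures = 5_502
fy2013_failures = 61_214

pct_increase = (fy2013_failures - fy2006_failures) / fy2006_failures * 100
print(f"{pct_increase:.1f}% increase")  # ≈ 1012.6%, i.e. the ~1,012% cited
```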
Turns terror
Overload makes terrorist attacks more likely to go unprevented
Eddington, 1/27 (Patrick Eddington, CATO institute, “No, Mass Surveillance Won't Stop
Terrorist Attacks” January 27, 2015 http://reason.com/archives/2015/01/27/mass-surveillance-and-terrorism#.19hszl:U8Io) //GY
The recent terrorist attack on the office of French satirical magazine Charlie Hebdo generated a
now-familiar meme: Another terrorist attack means we need more surveillance.¶ Sen. Bob Corker
(R-Tenn.) said that while "Congress having oversight certainly is important ... what is more
important relative to these types of events is ensuring we don't overly hamstring
the NSA's ability to collect this kind of information in advance and keep these
kinds of activities from occurring." Similarly, Sen. Lindsey Graham (R-S.C.) spoke of his
"fear" that "our intelligence capabilities, those designed to prevent such an attack from taking
place on our shores, are quickly eroding," adding that the government surveillance "designed to
prevent these types of attacks from occurring is under siege."¶ A recent poll demonstrates that
their sentiments are widely shared in the wake of the attack.¶ But would more mass surveillance
have prevented the assault on the Charlie Hebdo office? Events from 9/11 to the present help
provide the answer:¶ 2009: Umar Farouk Abdulmutallab—i.e., the "underwear bomber"—nearly
succeeded in downing the airliner he was on over Detroit because, according to then-National
Counterterrorism Center (NCC) director Michael Leiter, the federal Intelligence Community (IC)
failed "to connect, integrate, and fully understand the intelligence" it had collected.¶ 2009: Army
Major Nidal Hasan was able to conduct his deadly, Anwar al-Awlaki-inspired rampage at Ft.
Hood, Texas, because the FBI bungled its Hasan investigation.¶ 2013: The Boston Marathon
bombing happened, at least in part, because the CIA, Department of Homeland Security (DHS),
FBI, NCC, and National Security Agency (NSA) failed to properly coordinate and share
information about Tamerlan Tsarnaev and his family, associations, and travel to and from Russia
in 2012. Those failures were detailed in a 2014 report prepared by the Inspectors General of the
IC, Department of Justice, CIA, and DHS.¶ 2014: The Charlie Hebdo and French grocery store
attackers were not only known to French and U.S. authorities but one had a prior terrorism
conviction and another was monitored for years by French authorities until less than a year before
the attack on the magazine.¶ No, mass surveillance does not prevent terrorist attacks.¶
It’s worth remembering that the mass surveillance programs initiated by the U.S.
government after the 9/11 attacks—the legal ones and the constitutionally-dubious ones—
were premised on the belief that bin Laden’s hijacker-terrorists were able to pull
off the attacks because of a failure to collect enough data. Yet in their subsequent
reports on the attacks, the Congressional Joint Inquiry (2002) and the 9/11 Commission found
exactly the opposite. The data to detect (and thus foil) the plots was in the U.S.
government’s hands prior to the attacks; the failures were ones of sharing,
analysis, and dissemination. That malady perfectly describes every intelligence
failure from Pearl Harbor to the present day.¶ The Office of the Director of National
Intelligence (created by Congress in 2004) was supposed to be the answer to the "failure-to-connect-the-dots" problem. Ten years on, the problem remains, the IC bureaucracy is
bigger than ever, and our government is continuing to rely on mass surveillance
programs that have failed time and again to stop terrorists while simultaneously
undermining the civil liberties and personal privacy of every American. The quest
to "collect it all," to borrow a phrase from NSA Director Keith Alexander, only
leads to the accumulation of masses of useless information, making it harder to
find real threats and costing billions to store.¶ A recent Guardian editorial noted that
such mass-surveillance myopia is spreading among European political leaders as well,
despite the fact that "terrorists, from 9/11 to the Woolwich jihadists and the neo-Nazi Anders
Breivik, have almost always come to the authorities’ attention before murdering." ¶ Mass
surveillance is not only destructive of our liberties, its continued use is a virtual
guarantee of more lethal intelligence failures. And our continued will to
disbelieve those facts is a mental dodge we engage in at our peril.
Overload of data makes terrorism prevention impossible
Tufekci, assistant professor UNC, 2/3 (Zeynep Tufekci, assistant professor at the
University of North Carolina, “Terror and the limits of mass surveillance” Feb 03, 2015
http://blogs.ft.com/the-exchange/2015/02/03/zeynep-tufekci-terror-and-the-limits-of-mass-surveillance/) //GY
But the assertion that big data is “what it’s all about” when it comes to predicting
rare events is not supported by what we know about how these methods work,
and more importantly, don’t work. Analytics on massive datasets can be powerful in
analysing and identifying broad patterns, or events that occur regularly and frequently, but are
singularly unsuited to finding unpredictable, erratic, and rare needles in huge
haystacks. In fact, the bigger the haystack — the more massive the scale and the wider the
scope of the surveillance — the less suited these methods are to finding such
exceptional events, and the more they may serve to direct resources and attention
away from appropriate tools and methods.¶ After Rigby was killed, GCHQ, Britain’s
intelligence service, was criticised by many for failing to stop his killers, Michael Adebolajo and
Michael Adebowale. A lengthy parliamentary inquiry was conducted, resulting in a 192-page
report that lists all the ways in which Adebolajo and Adebowale had brushes with data
surveillance, but were not flagged as two men who were about to kill a soldier on a London street.
GCHQ defended itself by saying that some of the crucial online exchanges had taken place on a
platform, believed to be Facebook, which had not alerted the agency about these men, or the
nature of their postings. The men apparently had numerous exchanges that were extremist in
nature, and their accounts were suspended repeatedly by the platform for violating its terms of
service.¶ “If only Facebook had turned over more data,” the thinking goes.¶ But that is
misleading, and makes sense only with the benefit of hindsight. Seeking larger
volumes of data, such as asking Facebook to alert intelligence agencies every time that it
detects a post containing violence, would deluge the agencies with multiple false leads that
would lead to a data quagmire, rather than clues to impending crimes.¶ For big data
analytics to work, there needs to be a reliable connection between the signal (posting
of violent content) and the event (killing someone). Otherwise, the signal is worse
than useless. Millions of Facebook’s billion-plus users post violent content every day, ranging
from routinised movie violence to atrocious violent rhetoric. Turning over the data from all
such occurrences would merely flood the agencies with “false positives” —
erroneous indications for events that actually will not happen. Such data overload
is not without cost, as it takes time and effort to sift through these millions of
strands of hay to confirm that they are, indeed, not needles — especially when we
don’t even know what needles look like. All that the investigators would have
would be a lot of open leads with no resolution, taking away resources from any
real investigation. Besides, account suspensions carried out by platforms like Facebook’s are
haphazard, semi-automated and unreliable indicators. The flagging system misses a lot
more violent content than it flags, and it often flags content as inappropriate even
when it is not, and suffers from many biases. Relying on such a haphazard system is not
a reasonable path at all.¶ So is all the hype around big data analytics unjustified? Yes and no.
There are appropriate use cases for which massive datasets are intensely useful, and perform
much better than any alternative we can imagine using conventional methods. Successful
examples include using Google searches to figure out drug interactions that would be too complex
and too numerous to analyse one clinical trial at a time, or using social media to detect national-level swings in our mood (we are indeed happier on Fridays than on Mondays).
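Tufekci's "false positives" point is the classic base-rate problem, and it can be made concrete with a toy calculation (all numbers below are illustrative assumptions, not figures from the article): even a very accurate detector, applied to millions of posts in which genuine threats are vanishingly rare, buries the real signal under false alarms.

```python
# Toy base-rate calculation: why flagging "violent content" at scale
# floods analysts with false leads. All numbers are illustrative.
posts_screened = 10_000_000   # posts scanned per day
true_threats = 10             # actual precursor posts among them
true_positive_rate = 0.99     # detector catches 99% of real threats
false_positive_rate = 0.01    # and mislabels 1% of benign posts

flagged_real = true_threats * true_positive_rate
flagged_benign = (posts_screened - true_threats) * false_positive_rate

precision = flagged_real / (flagged_real + flagged_benign)
print(f"{flagged_benign:,.0f} false leads per day")  # ~100,000
print(f"precision: {precision:.5f}")                 # ~0.0001: 1 in 10,000 flags is real
```

Under these assumptions, analysts would face roughly 100,000 dead-end leads for every handful of real ones, which is exactly the "data quagmire" the card describes.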
Overload makes lone wolf terror prevention ineffective
Tufekci, assistant professor UNC, 2/3 (Zeynep Tufekci, assistant professor at the
University of North Carolina, “Terror and the limits of mass surveillance” Feb 03, 2015
http://blogs.ft.com/the-exchange/2015/02/03/zeynep-tufekci-terror-and-the-limits-of-mass-surveillance/) //GY
In contrast, consider the “lone wolf” attacker who took hostages at, of all things, a “Lindt
Chocolat Café” in Sydney. Chocolate shops are not regular targets of political violence,
and random, crazed men attacking them is not a pattern on which we can base further
identification. Yes, the Sydney attacker claimed jihadi ideology and brought a black flag with
Islamic writing on it, but given the rarity of such events, it’s not always possible to separate the
jihadi rhetoric from issues of mental health — every era’s mentally ill are affected by the cultural
patterns around them. This isn’t a job for big data analytics. (The fact that the gunman was
on bail facing various charges and was known for sending hate letters to the families of Australian
soldiers killed overseas suggests it was a job for traditional policing). ¶ When confronted with
their failures in predicting those rare acts of domestic terrorism, here’s what GCHQ,
and indeed the NSA, should have said instead of asking for increased surveillance
capabilities: stop asking us to collect more and more data to perform an
impossible task. This glut of data is making our job harder, not easier, and the
expectation that there will never be such incidents, ever, is not realistic.¶ Attention
should instead be focused on the causal chain that led the Kouachi brothers on their path. It
seems that the French-born duo had an alienated, turbulent youth, and then spent years in
French prisons, where they were transformed from confused and incompetent wannabe jihadis to
hardliners who were both committed and a lot more capable of carrying out complex violent acts
than when they entered the prison. Understanding such paths will almost certainly be
more productive for preventing such events, and will also spare all of us from
another real danger: governments that know too much about their citizens, and a
misguided belief in what big data can do to find needles in too-large haystacks.
Mass data mining makes terror prevention impossible
Schneier, 3/24 (Bruce Schneier, Advisory Board Member of the Electronic Privacy
Information Center, “Why Mass Surveillance Can't, Won't, And Never Has Stopped A Terrorist”
Mar 24 2015, 2:15 AM http://digg.com/2015/why-mass-surveillance-cant-wont-and-never-has-stopped-a-terrorist) //GY
Data mining is offered as the technique that will enable us to connect those dots. But while
corporations are successfully mining our personal data in order to target advertising, detect
financial fraud, and perform other tasks, three critical issues make data mining an
inappropriate tool for finding terrorists.¶ The first, and most important, issue is
error rates. For advertising, data mining can be successful even with a large error
rate, but finding terrorists requires a much higher degree of accuracy than data-mining systems can possibly provide.¶ Data mining works best when you’re searching for a
well-defined profile, when there are a reasonable number of events per year, and when the cost of
false alarms is low. Detecting credit card fraud is one of data mining’s security success stories: all
credit card companies mine their transaction databases for spending patterns that indicate a
stolen card. There are over a billion active credit cards in circulation in the United States, and
nearly 8% of those are fraudulently used each year. Many credit card thefts share a pattern —
purchases in locations not normally frequented by the cardholder, and purchases of travel, luxury
goods, and easily fenced items — and in many cases data-mining systems can minimize the losses
by preventing fraudulent transactions. The only cost of a false alarm is a phone call to the
cardholder asking her to verify a couple of her purchases.¶ Similarly, the IRS uses data mining to
identify tax evaders, the police use it to predict crime hot spots, and banks use it to predict loan
defaults. These applications have had mixed success, based on the data and the application, but
they’re all within the scope of what data mining can accomplish.¶ Terrorist plots are
different, mostly because whereas fraud is common, terrorist attacks are very
rare. This means that even highly accurate terrorism prediction systems will be so
flooded with false alarms that they will be useless.¶ The reason lies in the
mathematics of detection. All detection systems have errors, and system
designers can tune them to minimize either false positives or false negatives. In a
terrorist-detection system, a false positive occurs when the system mistakenly
identifies something harmless as a threat. A false negative occurs when the system misses
an actual attack. Depending on how you “tune” your detection system, you can increase the
number of false positives to assure you are less likely to miss an attack, or you can reduce the
number of false positives at the expense of missing attacks.¶ Because terrorist attacks
are so rare, false positives completely overwhelm the system, no matter how well
you tune. And I mean completely: millions of people will be falsely accused for every
real terrorist plot the system finds, if it ever finds any.¶ We might be able to deal with all of the
innocents being flagged by the system if the cost of false positives were minor. Think about the
full-body scanners at airports. Those alert all the time when scanning people. But a TSA officer
can easily check for a false alarm with a simple pat-down. This doesn’t work for a more general
data-based terrorism-detection system. Each alert requires a lengthy investigation to determine
whether it’s real or not. That takes time and money, and prevents intelligence officers from doing
other productive work. Or, more pithily, when you’re watching everything, you’re not
seeing anything.¶ The US intelligence community also likens finding a terrorist plot to looking
for a needle in a haystack. And, as former NSA director General Keith Alexander said, “you need
the haystack to find the needle.” That statement perfectly illustrates the problem with mass
surveillance and bulk collection. When you’re looking for the needle, the last thing you
want to do is pile lots more hay on it. More specifically, there is no scientific
rationale for believing that adding irrelevant data about innocent people makes it
easier to find a terrorist attack, and lots of evidence that it does not. You might be
adding slightly more signal, but you’re also adding much more noise. And despite the NSA’s
“collect it all” mentality, its own documents bear this out. The military intelligence community
even talks about the problem of “drinking from a fire hose”: having so much irrelevant data that
it’s impossible to find the important bits. We saw this problem with the NSA’s
eavesdropping program: the false positives overwhelmed the system. In the years
after 9/11, the NSA passed to the FBI thousands of tips per month; every one of
them turned out to be a false alarm. The cost was enormous, and ended up frustrating the
FBI agents who were obligated to investigate all the tips. We also saw this with the Suspicious
Activity Reports —or SAR — database: tens of thousands of reports, and no actual results. And all
the telephone metadata the NSA collected led to just one success: the conviction of a taxi driver
who sent $8,500 to a Somali group that posed no direct threat to the US — and that was probably
trumped up so the NSA would have better talking points in front of Congress. ¶ The second
problem with using data-mining techniques to try to uncover terrorist plots is
that each attack is unique. Who would have guessed that two pressure-cooker bombs would
be delivered to the Boston Marathon finish line in backpacks by a Boston college kid and his older
brother? Each rare individual who carries out a terrorist attack will have a disproportionate
impact on the criteria used to decide who’s a likely terrorist, leading to ineffective detection
strategies.¶ The third problem is that the people the NSA is trying to find are wily,
and they’re trying to avoid detection. In the world of personalized marketing, the
typical surveillance subject isn’t trying to hide his activities. That is not true in a police
or national security context. An adversarial relationship makes the problem much harder, and
means that most commercial big data analysis tools just don’t work. A commercial tool can simply
ignore people trying to hide and assume benign behavior on the part of everyone else.
Government data-mining techniques can’t do that, because those are the very people they’re
looking for.
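The detection mathematics Schneier invokes is Bayes' theorem applied to a rare event. A minimal sketch (the base rate and error rates below are hypothetical, picked only to illustrate the tuning trade-off he describes):

```python
# Bayes'-theorem sketch of the base-rate problem: however the detector is
# "tuned," the rarity of real plots keeps P(threat | flagged) tiny.
# All rates below are hypothetical.

def p_threat_given_flag(base_rate, true_positive_rate, false_positive_rate):
    """P(real threat | system flags you), by Bayes' theorem."""
    p_flag = (true_positive_rate * base_rate
              + false_positive_rate * (1 - base_rate))
    return (true_positive_rate * base_rate) / p_flag

base_rate = 1e-6  # one real plotter per million people monitored

# Three "tunings": catch more attacks (higher TPR) at the cost of more
# false alarms (higher FPR), or vice versa.
for tpr, fpr in [(0.999, 0.05), (0.99, 0.01), (0.90, 0.001)]:
    posterior = p_threat_given_flag(base_rate, tpr, fpr)
    print(f"TPR={tpr}, FPR={fpr}: P(threat|flag) = {posterior:.6f}")
```

Under these assumptions even the strictest tuning leaves fewer than one flag in a thousand pointing at a real plot, which is Schneier's "millions falsely accused for every real terrorist plot" in miniature.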
Data overload risks terror attacks – whistleblowers confirm
Whittaker, 15
Zack Whittaker is a writer-editor for ZDNet, and sister sites CNET and CBS News, citing an NSA
whistleblower, “NSA is so overwhelmed with data, it's no longer effective, says whistleblower,”
ZDNet, 4/30/15, http://www.zdnet.com/article/nsa-whistleblower-overwhelmed-with-data-ineffective/?tag=nl.e539&s_cid=e539&ttag=e539&ftag=TRE17cfd61 // IS
In a lunch meeting hosted by Contrast Security founder Jeff Williams on Wednesday, William
Binney, a former NSA official who spent more than three decades at the agency,
said the US government's mass surveillance programs have become so
engorged with data that they are no longer effective, losing vital
intelligence in the fray.
That, he said, can -- and has -- led to terrorist attacks succeeding.
Binney said that an analyst today can run one simple query across the NSA's
various databases, only to become immediately overloaded with information.
With about four billion people -- around two-thirds of the world's population -- under the NSA and partner agencies' watchful eyes, according to his estimates,
there is too much data being collected.
"That's why they couldn't stop the
Boston bombing, or the Paris shootings,
because the data was all there," said Binney. Because the agency isn't carefully and
methodically setting its tools up for smart data collection, that leaves analysts to
search for a needle in a haystack.
"The data was all there... the NSA is great at going back over it forensically for years to see what
they were doing before that," he said. "But that doesn't stop it."
Binney called this a "bulk data failure" -- in that the NSA programs, leaked by Edward
Snowden, are collecting too much for the agency to process. He said the problem runs
deeper across law enforcement and other federal agencies, like the FBI, the CIA, and the Drug
Enforcement Administration (DEA), which all have access to NSA intelligence.
Overload kills attempts to stop Al Qaeda attacks
Robb, Air Force analyst, ’06 (John Robb, Air Force analyst, “NSA: The Problems with
Massively Automated Domestic Surveillance” May 11, 2006
http://globalguerrillas.typepad.com/johnrobb/2006/05/nsa_the_problem.html) //GY
Noah, at DefenseTech, tapped Valdis Krebs for his analysis of the problems with the slowly leaked
details on the NSA's domestic surveillance efforts. Valdis makes the absolutely correct observation
that:
The right thing to do is to look for the best haystack, not the biggest haystack. We
knew exactly which haystack to look at in the year 2000 [before the 9/11 attacks].
We just didn't do it...
To me, it's pretty clear that the people working on this program aren't as smart as they think they
are. Some top level thinking indicates that this will quickly become a rat hole for
federal funds (due to wasted effort) and a major source of infringement of
personal freedom. Here's some detail:
It will generate oodles of false positives. Al Qaeda is now in a phase where most
domestic attacks will be generated by people not currently connected to the
movement (like we saw in the London bombings). This means that in many respects they will
look like you and me until they act. The large volume of false positives generated will
not only be hugely inefficient, it will be a major infringement on US liberties. For
example, a false positive will likely get you automatically added to a no-fly list, your boss may be
visited (which will cause you to lose your job), etc.
It will be expanded to monitor domestic groups other than al Qaeda. As
we have already seen in numerous incidents across the US, every group that opposes the war or
deals with issues in the Middle East will eventually fall under surveillance. Eventually, this will
begin to bump up the political process by targeting groups that are politically
active in the opposition party.
The database and associated information will be used for purposes other than
tracking groups. For example: finding who leaked a classified document to a reporter by
reading the list of all calls made to that reporter (who is likely on the target list due to the subjects
they cover).
Info overload creates redundancy – lack of info sharing makes it
impossible to stop attacks
Priest and Arkin 10 (Dana Priest, American academic, journalist and writer, Washington
Post, William Arkin, American political commentator, best-selling author, journalist, activist,
blogger, and former United States Army soldier, 7/19, “Top Secret America,”
http://projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-growing-beyond-control/print///Tang)
And then came a problem that continues to this day, which has to do with the ODNI's rapid expansion. When it opened in the spring of
2005, Negroponte's office was all of 11 people stuffed into a secure vault with closet-size rooms a block from the White House. A year later,
the budding agency moved to two floors of another building. In April 2008, it moved into its huge permanent home, Liberty Crossing.
Today, many officials who work in the intelligence agencies say they remain unclear about what the ODNI is in charge of. To be sure, the
ODNI has made some progress, especially in intelligence-sharing, information technology and budget reform. The DNI and his managers
hold interagency meetings every day to promote collaboration. The last director, Blair, doggedly pursued such nitty-gritty issues as
procurement reform, compatible computer networks, tradecraft standards and collegiality. But improvements have been
overtaken by volume at the ODNI, as the increased flow of intelligence data
overwhelms the system's ability to analyze and use it. Every day, collection
systems at the National Security Agency intercept and store 1.7 billion e-mails,
phone calls and other types of communications. The NSA sorts a fraction of those into 70 separate
databases. The same problem bedevils every other intelligence agency, none of which
have enough analysts and translators for all this work. The practical effect of this
unwieldiness is visible, on a much smaller scale, in the office of Michael Leiter, the director of the National
Counterterrorism Center. Leiter spends much of his day flipping among four computer monitors lined up on his desk. Six hard drives sit at
his feet. The data flow is enormous, with dozens of databases feeding separate computer networks that cannot interact with one another.
There is a long explanation for why these databases are still not connected, and it amounts to this: It's too hard, and some agency heads
don't really want to give up the systems they have. But there's some progress: "All my e-mail on one computer now," Leiter says. "That's a
big deal." To get another view of how sprawling Top Secret America has become, just head west on the toll road toward Dulles International
Airport. As a Michaels craft store and a Books-A-Million give way to the military intelligence giants Northrop Grumman and Lockheed
Martin, find the off-ramp and turn left. Those two shimmering-blue five-story ice cubes belong to the National Geospatial-Intelligence
Agency, which analyzes images and mapping data of the Earth's geography. A small sign obscured by a boxwood hedge says so. Across the
street, in the chocolate-brown blocks, is Carahsoft, an intelligence agency contractor specializing in mapping, speech analysis and data
harvesting. Nearby is the government's Underground Facility Analysis Center. It identifies overseas underground command centers
associated with weapons of mass destruction and terrorist groups, and advises the military on how to destroy them. Clusters of top-secret
work exist throughout the country, but the Washington region is the capital of Top Secret America. About half of the post-9/11 enterprise is
anchored in an arc stretching from Leesburg south to Quantico, back north through Washington and curving northeast to Linthicum, just
north of the Baltimore-Washington International Marshall Airport. Many buildings sit within off-limits government compounds or military
bases. Others occupy business parks or are intermingled with neighborhoods, schools and shopping centers and go unnoticed by most
people who live or play nearby. Many of the newest buildings are not just utilitarian offices but also edifices "on the order of the pyramids,"
in the words of one senior military intelligence officer. Not far from the Dulles Toll Road, the CIA has expanded into two buildings that will
increase the agency's office space by one-third. To the south, Springfield is becoming home to the new $1.8 billion National Geospatial-Intelligence Agency headquarters, which will be the fourth-largest federal building in the area and home to 8,500 employees. Economic
stimulus money is paying hundreds of millions of dollars for this kind of federal construction across the region. It's not only the
number of buildings that suggests the size and cost of this expansion, it's also
what is inside: banks of television monitors. "Escort-required" badges. X-ray
machines and lockers to store cellphones and pagers. Keypad door locks that open special rooms
encased in metal or permanent dry wall, impenetrable to eavesdropping tools and protected by alarms and a security force capable of
responding within 15 minutes. Every one of these buildings has at least one of these rooms, known as a SCIF, for sensitive compartmented
information facility. Some are as small as a closet; others are four times the size of a football field. SCIF size has become a measure of status
in Top Secret America, or at least in the Washington region of it. "In D.C., everyone talks SCIF, SCIF, SCIF," said Bruce Paquin, who moved
to Florida from the Washington region several years ago to start a SCIF construction business. "They've got the penis envy thing going. You
can't be a big boy unless you're a three-letter agency and you have a big SCIF." SCIFs are not the only must-have items people pay attention
to. Command centers, internal television networks, video walls, armored SUVs and personal security guards have also become the bling of
national security. "You can't find a four-star general without a security detail," said one three-star general now posted in Washington after
years abroad. "Fear has caused everyone to have stuff. Then comes, 'If he has one, then I have to have one.' It's become a status symbol."
Among the most important people inside the SCIFs are the low-paid employees carrying their lunches to work to save money. They are the
analysts, the 20- and 30-year-olds making $41,000 to $65,000 a year, whose job is at the core of everything Top Secret America tries to do.
At its best, analysis melds cultural understanding with snippets of conversations, coded dialogue, anonymous tips, even scraps of trash,
turning them into clues that lead to individuals and groups trying to harm the United States. Their work is greatly enhanced by computers
that sort through and categorize data. But in the end, analysis requires human judgment, and half
the analysts are relatively inexperienced, having been hired in the past several
years, said a senior ODNI official. Contract analysts are often straight out of college and trained at corporate headquarters. When hired,
a typical analyst knows very little about the priority countries - Iraq, Iran, Afghanistan and Pakistan - and is not fluent in their languages.
Still, the number of intelligence reports they produce on these key countries is overwhelming, say current and former intelligence officials
who try to cull them every day. The ODNI doesn't know exactly how many reports are issued each year, but in the process of trying to find
out, the chief of analysis discovered 60 classified analytic Web sites still in operation that were supposed to have been closed down for lack
of usefulness. "Like a zombie, it keeps on living" is how one official describes the sites.
The problem with many
intelligence reports, say officers who read them, is that they simply re-slice the same facts
already in circulation. "It's the soccer ball syndrome. Something happens, and they want to
rush to cover it," said Richard H. Immerman, who was the ODNI's assistant deputy director of national intelligence for analytic
integrity and standards until early 2009. "I saw tremendous overlap." Even the analysts at the National Counterterrorism
Center (NCTC), which is supposed to be where the most sensitive, most difficult-to-obtain nuggets of information are fused together, get
low marks from intelligence officials for not producing reports that are original, or at least better than the reports already written by the
CIA, FBI, National Security Agency or Defense Intelligence Agency. When Maj. Gen. John M. Custer was the director of intelligence at U.S.
Central Command, he grew angry at how little helpful information came out of the NCTC. In 2007, he visited its director at the time, retired
Vice Adm. John Scott Redd, to tell him so. "I told him that after 41/2 years, this organization had never produced one shred of information
that helped me prosecute three wars!" he said loudly, leaning over the table during an interview. Two years later, Custer, now head of the
Army's intelligence school at Fort Huachuca, Ariz., still gets red-faced recalling that day, which reminds him of his frustration with
Washington's bureaucracy. "Who has the mission of reducing redundancy and ensuring
everybody doesn't gravitate to the lowest-hanging fruit?" he said. "Who orchestrates
what is produced so that everybody doesn't produce the same thing?" He's hardly the only
one irritated. In a secure office in Washington, a senior intelligence officer was dealing with his own frustration. Seated at his computer, he
began scrolling through some of the classified information he is expected to read every day: CIA World Intelligence Review, WIRe-CIA, Spot
Intelligence Report, Daily Intelligence Summary, Weekly Intelligence Forecast, Weekly Warning Forecast, IC Terrorist Threat Assessments,
NCTC Terrorism Dispatch, NCTC Spotlight . . . It's too much, he complained. The inbox on his desk was full, too. He threw up his arms,
picked up a thick, glossy intelligence report and waved it around, yelling. "Jesus! Why does it take so long to produce?" "Why does it have to
be so bulky?" "Why isn't it online?" The overload of hourly, daily, weekly, monthly and annual
reports is actually counterproductive, say people who receive them. Some policymakers and
senior officials don't dare delve into the backup clogging their computers. They
rely instead on personal briefers, and those briefers usually rely on their own
agency's analysis, re-creating the very problem identified as a main cause of the
failure to thwart the attacks: a lack of information-sharing.
More surveillance fails to prevent terror attacks – multiple examples
Marlowe 10 (Lara, Paris Correspondent with The Irish Times., citing Top Secret America
report, 7/24, “Information overload threatening to choke response to terror.”
http://www.irishtimes.com/news/information-overload-threatening-to-choke-response-to-terror-1.626474//Tang)
A report on the colossal counter-terrorism intelligence industry in the US shows that it may be
drowning in an ocean of raw data THIS, I suspect, is how empires die: over-extended,
asphyxiated by bureaucracy, drowning in information they cannot adequately
assess or act upon. The Washington Post published a stunning, three-day series totalling 11 pages
this week on “Top Secret America”. It was the result of an investigation over two years
by Dana Priest and William Arkin into the explosion of the intelligence industry since September 11th, 2001. Consider the
statistics: 1,271
government organisations and 1,931 private companies are now
devoted to counter-terrorism, “homeland security” and intelligence, in 10,000
locations across the US. An estimated 854,000 Americans – 1.5 times the population of Washington DC – hold
top secret security clearances. Nearly one-third of them are private contractors. About half of Top Secret America is
concentrated in a swathe of land running diagonally from Virginia to the southwest, across Washington DC and into
Maryland to the northeast. In
the Washington area alone, 33 top-secret building
complexes, some of them unmarked and windowless behind high fences, have
been or are being built since 9/11. They total 1.6 million sq m (17 million sq ft), the equivalent of 22 US
Capitol buildings. Turf battles between intelligence agencies, the habit of holding information close to
the chest and the impossibility of co-ordinating so much activity makes for huge
amounts of duplication. For example, 51 federal organisations and military commands
are dedicated to tracking the money of terrorists. The volume of reporting generated by Top
Secret America – 50,000 intelligence reports each year – means no one has a full grasp
of what is known. As James Clapper, President Obama’s nominee for director of national intelligence, told the
Post: “There’s only one entity in the entire universe that has visibility on all (top secret programmes) – that’s God.” “The
complexity of this system defies description,” said another high-ranking source, retired army Lt Gen John Vines,
commissioned to track intelligence at the Department of Defence. The Post concluded that despite a 250 per cent increase
in intelligence spending since 9/11, despite the creation or restructuring of 263 organisations, “the problems that gusher of
money and bureaucracy were meant to solve . . . have not been alleviated”. Agencies
are still failing to
share information or “connect the dots”. America may not be measurably safer for the more than $75
billion (€58 billion) it spends each year on intelligence. The National Security Agency intercepts and stores 1.7 billion emails, phone calls and other communications daily. But the NSA and other agencies doing similar work don’t have enough
analysts and translators to process the information they cull. One could argue that the absence of large-scale, lethal attacks
on the US continent since 9/11 shows the system is working. But three
recent cases show how Top
Secret America failed to forestall real threats. Last November, US army Maj Nidal
Hasan went on a shooting rampage at Fort Hood Texas, killing 13 people and
wounding 30 others. When he was training as a psychiatrist at Walter Reed Army
Medical Centre, Hasan had warned his superiors of “adverse events” if Muslims
were not allowed to leave the army. And he exchanged e-mails with Anwar
Awlaki, a radical cleric based in Yemen whom the US has targeted for
assassination. But the army’s intelligence unit did not notice Hasan’s behaviour. Its
programme, called RITA for Radical Islamic Threat to the Army, was too busy replicating work
by the Department of Homeland Security and FBI on Islamist student groups in
the US. Last autumn, President Obama signed a secret order to send dozens of
commandos to Yemen, where they set up an intelligence centre bristling with hi-tech equipment. Their voluminous reports were bundled into the 5,000 pieces of
data sent daily to the National Counter-terrorism Centre in Washington. Buried
in the deluge was the news that a radical Nigerian student had visited Yemen,
that a Nigerian father was worried about his son who’d gone to Yemen. But when
Umar Farouk Abdulmutallab tried to blow himself up on a flight to Detroit on
Christmas Day, the aircraft was saved by a passenger who saw smoke coming
from Abdulmutallab’s underwear and tackled him, preventing him from detonating the device.
Likewise, it was a vendor in Manhattan who alerted police to a home-made car
bomb on Times Square at the beginning of May. Faisal Shahzad, the Pakistani-born
American citizen who concocted the mix of fertiliser and bleach, was also in contact with Anwar
Awlaki. The Post reports that analysts working on the “priority countries” of Iraq, Iran,
Afghanistan and Pakistan know little about them and do not speak their
languages, yet produce an “overwhelming” number of reports. The many-tentacled
intelligence community in the US seems blighted by two of the same woes as US
journalism: the same information is rehashed over and over, and recipients are
powerless to sift through the glut of material.
Info overload is counterproductive to counterterror efforts –
cognitive burden
Maas 5/28 (Peter, written about war, media, and national security for The New York Times
Magazine, The New Yorker, and The Washington Post. 5/28/15, “INSIDE NSA, OFFICIALS
PRIVATELY CRITICIZE “COLLECT IT ALL” SURVEILLANCE,”
https://firstlook.org/theintercept/2015/05/28/nsa-officials-privately-criticize-collect-it-all-surveillance//Tang)
AS MEMBERS OF CONGRESS struggle to agree on which surveillance programs to re-authorize before the Patriot Act
expires, they might consider the unusual advice of an
intelligence analyst at the National Security
Agency who warned about the danger of collecting too much data. Imagine, the analyst
wrote in a leaked document, that you are standing in a shopping aisle trying to decide
between jam, jelly or fruit spread, which size, sugar-free or not, generic or
Smucker’s. It can be paralyzing. “We in the agency are at risk of a similar,
collective paralysis in the face of a dizzying array of choices every single day,” the
analyst wrote in 2011. “’Analysis paralysis’ isn’t only a cute rhyme. It’s the term for what
happens when you spend so much time analyzing a situation that you ultimately
stymie any outcome …. It’s what happens in SIGINT [signals intelligence] when we
have access to endless possibilities, but we struggle to prioritize, narrow, and
exploit the best ones.” The document is one of about a dozen in which NSA intelligence experts express concerns
usually heard from the agency’s critics: that the U.S. government’s “collect it all” strategy can
undermine the effort to fight terrorism. The documents, provided to The Intercept by NSA
whistleblower Edward Snowden, appear to contradict years of statements from senior officials who have claimed that
pervasive surveillance of global communications helps the government identify terrorists before they strike or quickly find
them after an attack. The Patriot Act, portions of which expire on Sunday, has been used since 2001 to conduct a number
of dragnet surveillance programs, including the bulk collection of phone metadata from American companies. But the
documents suggest that analysts at the NSA have drowned in data since 9/11,
making it more difficult for them to find the real threats. The titles of the documents capture
their overall message: “Data Is Not Intelligence,” “The Fallacies Behind the Scenes,” “Cognitive Overflow?” “Summit
Fever” and “In Praise of Not Knowing.” Other titles include “Dealing With a ‘Tsunami’ of Intercept” and “Overcome by
Overload?” The documents are not uniform in their positions. Some acknowledge the overload problem but say the agency
is adjusting well. They do not specifically mention the Patriot Act, just the larger dilemma of cutting through a flood of
incoming data. But in
an apparent sign of the scale of the problem, the documents
confirm that the NSA even has a special category of programs that is called
“Coping With Information Overload.” The jam vs. jelly document, titled “Too
Many Choices,” started off in a colorful way but ended with a fairly stark warning:
“The SIGINT mission is far too vital to unnecessarily expand the haystacks while
we search for the needles. Prioritization is key.” These doubts are infrequently heard from officials
inside the NSA. These documents are a window into the private thinking of mid-level officials who are almost never
permitted to discuss their concerns in public. AN AMUSING PARABLE circulated at the NSA a few years ago. Two people
go to a farm and purchase a truckload of melons for a dollar each. They then sell the melons along a busy road for the
same price, a dollar. As they drive back to the farm for another load, they realize they aren’t making a profit, so one of
them suggests, “Do you think we need a bigger truck?” The parable was written by an intelligence analyst in a document
dated Jan. 23, 2012 that was titled, “Do We Need a Bigger SIGINT Truck?” It expresses, in a lively fashion, a critique of the
agency’s effort to collect what former NSA Director Keith Alexander referred to as “the whole haystack.” The
critique
goes to the heart of the agency’s drive to gather as much of the world’s
communications as possible: because it may not find what it needs in a partial
haystack of data, the haystack is expanded as much as possible, on the
assumption that more data will eventually yield useful information. “THE
PROBLEM IS THAT WHEN YOU COLLECT IT ALL, WHEN YOU MONITOR
EVERYONE, YOU UNDERSTAND NOTHING.” –EDWARD SNOWDEN The Snowden files show
that in practice, it doesn’t turn out that way: more is not necessarily better, and in fact, extreme
volume creates its own challenges. “Recently I tried to answer what seemed like a relatively
straightforward question about which telephony metadata collection capabilities are the most important in case we need
to shut something off when the metadata coffers get full,” wrote the intelligence analyst. “By the end of the day, I felt like
capitulating with the white flag of, ‘We need COLOSSAL data storage so we don’t have to worry about it,’ (aka we need a
bigger SIGINT truck).” The analyst added, “Without metrics, how do we know that we have improved something or made
it worse? There’s a running joke … that we’ll only know if collection is important by shutting it off and seeing if someone
screams.” Another document, while not mentioning the dangers of collecting too much data, expressed concerns about
pursuing entrenched but unproductive programs. “How many times have you been watching a terrible movie, only to
convince yourself to stick it out to the end and find out what happens, since you’ve already invested too much time or
money to simply walk away?” the document asked. “This ‘gone too far to stop now’ mentality is our built-in mechanism to
help us allocate and ration resources. However, it
can work to our detriment in prioritizing and
deciding which projects or efforts are worth further expenditure of resources,
regardless of how much has already been ‘sunk.’ As has been said before, insanity is doing the
same thing over and over and expecting different results.” “WE ARE DROWNING IN
INFORMATION. AND YET WE KNOW NOTHING. FOR SURE.” –NSA INTELLIGENCE
ANALYST Many of these documents were written by intelligence analysts who had regular columns distributed on
NSANet, the agency’s intranet. One of the columns was called “Signal v. Noise,” another was called “The SIGINT
Philosopher.” Two of the documents cite the academic work of Herbert Simon, who won a Nobel Prize for his pioneering
research on what’s become known as the attention economy. Simon wrote that consumers and managers have trouble
making smart choices because their exposure to more information decreases their ability to understand the information.
Both documents mention the same passage from Simon’s essay, Designing Organizations for an Information-Rich World:
“In
an information-rich world, the wealth of information means a dearth of
something else: a scarcity of whatever it is that information consumes. What
information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates
a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources
that might consume it.” In addition to consulting Nobel-prize winning work, NSA analysts have turned to easier literature,
such as Malcolm Gladwell’s best-selling Blink: The Power of Thinking Without Thinking. The author of a 2011 document
referenced Blink and stated, “The
key to good decision making is not knowledge. It is
understanding. We are swimming in the former. We are desperately lacking in
the latter.” The author added, “Gladwell has captured one of the biggest challenges facing SID today. Our costs
associated with this information overload are not only financial, such as the need to build
data warehouses large enough to store the mountain of data that arrives at our doorstep each day, but also include
the more intangible costs of too much data to review, process, translate and
report.” Alexander, the NSA director from 2005 to 2014 and chief proponent of the agency’s “collect it all” strategy,
vigorously defended the bulk collection programs. “What we have, from my perspective, is a reasonable approach on how
we can defend our nation and protect our civil liberties and privacy,” he said at a security conference in Aspen in 2013. He
added, “You need the haystack to find the needle.” The same point has been made by other officials, including James Cole,
the former deputy attorney general who told a congressional committee in 2013, “If you’re looking for the needle in the
haystack, you have to have the entire haystack to look through.” NSA Slide, May 2011 The opposing viewpoint was voiced
earlier this month by Snowden, who noted in an interview with the Guardian that the men who committed recent terrorist
attacks in France, Canada and Australia were under surveillance—their data was in the haystack yet they weren’t singled
out. “It wasn’t the fact that we weren’t watching people or not,” Snowden said. “It was the fact that we were watching
people so much that we did not understand what we had. The problem is that when you collect it all, when you monitor
everyone, you understand nothing.” In a 2011 interview with SIDtoday, a deputy director in the Signals Intelligence
Directorate was asked about “analytic
modernization” at the agency. His response, while positive on the NSA’s
ability to surmount obstacles, noted that it faced difficulties, including the fact that some targets
use encryption and switch phone numbers to avoid detection. He pointed to
volume as a particular problem. “We live in an Information Age when we have massive reserves of
information and don’t have the capability to exploit it,” he stated. “I was told that there are 2 petabytes of data in the
SIGINT System at any given time. How much is that? That’s equal to 20 million 4-drawer filing cabinets. How many
cabinets per analyst is that? By the end of this year, we’ll have 1 terabyte of data per second coming in. You can’t crank that
through the existing processes and be effective.” The documents noted the difficulty of sifting through the ever-growing
haystack of data. For instance, a 2011 document titled “ELINT Analysts – Overcome by Overload? Help is Here with
IM&S” outlined a half dozen computer tools that “are designed to invert the paradigm where an analyst spends more time
searching for data than analyzing it.” Another document, written by an intelligence analyst in 2010, bluntly stated that “we
are drowning in information. And yet we know nothing. For sure.” The analyst went on to ask, “Anyone know just how
many tools are available at the Agency, alone? Would you know where to go to find out? Anyone ever start a new
target…without the first clue where to begin? Did you ever start a project wondering if you were the sole person in the
Intelligence Community to work this project? How would you find out?” The analyst, trying to encourage more sharing of
tips about the best ways to find data in the haystack, concluded by writing, in boldface, “Don’t let those coming behind you
suffer the way you have.” The agency appears to be spending significant sums of money to solve the haystack problem. The
document headlined “Dealing With a ‘Tsunami’ of Intercept,” written in 2006 by three NSA officials and previously
published by The Intercept, outlined a series of programs to prepare for a near future in which the speed and volume of
signals intelligence would explode “almost beyond imagination.” The document referred to a mysterious NSA entity–the
“Coping With Information Overload Office.” This appears to be related to an item in the Intelligence Community’s 2013
Budget Justification to Congress, known as the “black budget”—$48.6 million for projects related to “Coping with
Information Overload.” The data glut is felt in the NSA’s partner agency in Britain, too. A slideshow entitled “A Short
Introduction to SIGINT,” from GCHQ, the British intelligence agency, posed the following question: “How are people
supposed to keep on top of all their targets and the new ones when they have far more than [they] could do in a day? How
are they supposed to find the needle in the haystack and prioritise what is most important to look at?” The slideshow
continued, “Give
an analyst three leads, one of which is interesting: they may have
time to follow that up. Give them three hundred leads, ten of which are
interesting: that’s probably not much use.” These documents tend to shy away from confrontation—
they express concern with the status quo but do not blame senior officials or demand an abrupt change of course. They
were written by agency staffers who appear to believe in the general mission of the NSA. For instance, the author of a
“SIGINT Philosopher” column wrote that if the NSA was a corporation, it could have the following mission statement:
“building informed decision makers — so that targets do not suffer our nation’s wrath unless they really deserve it — by
exercising deity-like monitoring of the target.” On occasion, however, the veil of bureaucratic deference is lowered. In
another “SIGINT Philosopher” column, “Cognitive Overflow?,” the author offered a forthright assessment of the haystack
problem and the weakness of proposed solutions: “If an individual brain has finite ‘channel capacity,’ does the vast
collective of SID, comprised of thousands of brilliant, yet limited, brains also have a definite ‘channel capacity’? If so, what
is it? How do we know when we’ve reached it? What if we’ve already exceeded it? In essence, could SID’s reach exceed its
grasp? Can the combined cognitive power of SID connect all the necessary dots to avoid, predict, or advise when the
improbable, complex, or unthinkable happens?” The column did not offer an optimistic view. “Take for example the
number of tools, clearances, systems, compliances, and administrative requirements we encounter before we even begin to
engage in the work of the mission itself,” the column continued. “The mission then involves an ever-expanding set of
complex issues, targets, accesses, and capabilities. The
‘cognitive burden,’ so to speak, must at
times feel overwhelming to some of us.” The analyst who wrote the column dismissed, politely but
firmly, the typical response of senior officials when they are asked in public about their ability to find needles in their
expanding haystack. “Surely someone will point out that the burgeoning amalgam of technological advances will aid us in
shouldering the burden,” he noted. “However, historically, this scenario doesn’t seem to completely bear out. The
onslaught of more computer power—often intended to automate some processes—has in many respects
demanded an expansion of our combined ‘channel capacity’ rather than curbing
the flow of the information.”
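The SIDtoday figures quoted above (2 petabytes at rest, 1 terabyte per second incoming, 20 million filing cabinets) can be sanity-checked with some quick arithmetic. This is a back-of-the-envelope sketch, not anything from the documents themselves; decimal units are assumed.

```python
# Back-of-the-envelope check of the figures quoted in the 2011 SIDtoday
# interview: "2 petabytes of data in the SIGINT System at any given time"
# and "1 terabyte of data per second coming in."

PB = 10**15          # bytes in a petabyte (decimal)
TB = 10**12          # bytes in a terabyte (decimal)

standing_store = 2 * PB        # data at rest at any given time
ingest_rate = 1 * TB           # bytes arriving per second

seconds_to_refill = standing_store / ingest_rate
print(f"At 1 TB/s, 2 PB arrives in {seconds_to_refill / 60:.0f} minutes")
# i.e. the entire standing store is replaced roughly twice an hour

cabinets = 20_000_000          # "20 million 4-drawer filing cabinets"
print(f"Implied capacity per cabinet: {standing_store / cabinets / 10**6:.0f} MB")
```

The point the deputy director is making falls out directly: an archive that turns over every half hour cannot be "cranked through" any analyst-driven process.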
Mass surveillance collects too much data – hurts the fight against
terror
Angwin, award-winning senior reporter at the Wall Street Journal, ’13
(JULIA ANGWIN, Wall Street Journal, “NSA Struggles to Make Sense of Flood of Surveillance
Data” Dec. 25, 2013 10:30 p.m. ET
http://www.wsj.com/articles/SB10001424052702304202204579252022823658850) //GY
LAUSANNE, Switzerland— William Binney, creator of some of the computer code used by the
National Security Agency to snoop on Internet traffic around the world, delivered an unusual
message here in September to an audience worried that the spy agency knows too much. ¶ It knows
so much, he said, that it can't understand what it has.¶ "What they are doing is making themselves
dysfunctional by taking all this data," Mr. Binney said at a privacy conference here. ¶ The agency
is drowning in useless data, which harms its ability to conduct legitimate
surveillance, claims Mr. Binney, who rose to the civilian equivalent of a general during more
than 30 years at the NSA before retiring in 2001. Analysts are swamped with so much
information that they can't do their jobs effectively, and the enormous stockpile is
an irresistible temptation for misuse.¶ Mr. Binney's warning has gotten far less attention
than legal questions raised by leaks from former NSA contractor Edward Snowden about the
agency's mass collection of information around the world. Those revelations unleashed a re-
examination of the spy agency's aggressive tactics.¶ ¶ But the NSA needs more room to
store all the data it collects—and new phone records, data on money transfers and
other information keep pouring in. A new storage center being built in Utah will eventually
be able to hold more than 100,000 times as much as the contents of printed materials in the
Library of Congress, according to outside experts.¶ Some of the documents released by Mr.
Snowden detail concerns inside the NSA about drowning in information. An internal briefing
document in 2012 about foreign cellphone-location tracking by the agency said
the efforts were "outpacing our ability to ingest, process and store" data.¶ In
March 2013, some NSA analysts asked for permission to collect less data through
a program called Muscular because the "relatively small intelligence value it
contains does not justify the sheer volume of collection," another document shows.¶ In
response to questions about Mr. Binney's claims, an NSA spokeswoman says the agency is "not
collecting everything, but we do need the tools to collect intelligence on foreign adversaries who
wish to do harm to the nation and its allies."¶ Existing surveillance programs were approved by
"all three branches of government," and each branch "has a role in oversight," she adds.¶ In a
statement through his lawyer, Mr. Snowden says: "When your working process every
morning starts with poking around a haystack of seven billion innocent lives,
you're going to miss things." He adds: "We're blinding people with data we don't need."¶ A
presidential panel recommended earlier this month that the agency shut down its bulk collection
of telephone-call records of all Americans. The federal government could accomplish the same
goal by querying phone companies, the panel concluded.¶ The panel also recommended the
creation of "smart software" that could sort data as the information is collected, rather than the
current system where "vast amounts of data are swept up and the sorting is done after it has been
copied" on to data-storage systems. Administration officials are reviewing the report.¶ A separate
task force is expected to issue its own findings next year, and lawmakers have proposed several
bills that would change how the NSA collects and uses data.¶ The 70-year-old Mr. Binney says he
is generally underwhelmed by the panel's "bureaucratic" report, though "it would be something
meaningful" if the controversy leads to adoption of the "smart software" strategy and creation of a
technology oversight group with full access to "be in the knickers of the NSA" and Federal Bureau
of Investigation.¶ Mr. Binney lives off his government pension and makes occasional appearances
to talk about his work at the NSA.¶ The spy agency has defended its sweeping
surveillance programs as essential in the fight against terrorism. But having too
much data can hurt those efforts, according to Mr. Binney and a handful of colleagues who
have raised concerns since losing an internal battle to build privacy-protecting Internet
surveillance tools in the late 1990s.¶
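Binney's and Snowden's "haystack" objection is, at bottom, base-rate arithmetic. The sketch below illustrates it with a hypothetical screening filter; every number here except Snowden's "seven billion" figure is an assumption chosen for illustration, not a claim from the article.

```python
# Illustrative base-rate arithmetic for the "bigger haystack" problem.
# Assumed numbers: a hypothetical filter with a 99% true-positive rate
# and a 1% false-positive rate, and an assumed 3,000 genuine targets.

population = 7_000_000_000     # Snowden's "haystack of seven billion innocent lives"
true_targets = 3_000           # assumption, for illustration only
tpr = 0.99                     # assumed true-positive rate
fpr = 0.01                     # assumed false-positive rate

hits = true_targets * tpr
false_alarms = (population - true_targets) * fpr
precision = hits / (hits + false_alarms)

print(f"Flagged leads: {hits + false_alarms:,.0f}")
print(f"Fraction of flagged leads that are real: {precision:.5%}")
```

Even a filter far more accurate than anything realistic floods analysts with tens of millions of false leads for every few thousand real ones, which is the mechanism behind "when you collect it all... you understand nothing."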
Turns military readiness
Data overload wrecks military readiness and training
Erwin, National Defense Magazine, ‘12 (Sandra I. Erwin, National Defense
Industrial Association, “Too Much Information, Not Enough Intelligence” May 2012,
http://www.nationaldefensemagazine.org/archive/2012/May/Pages/TooMuchInformation,NotE
noughIntelligence.aspx) //GY
The Defense Department over the last decade has built up an inventory of billions of dollars worth
of spy aircraft and battlefield sensors. Those systems create avalanches of data that clog
military information networks and overwhelm analysts.¶ Intelligence experts say
the military is drowning in data but not able to convert that information into
intelligible reports that break it down and analyze it. ¶ “The challenge for users of
intelligence is that all the different types of information come in a stove-piped manner,” says
Michael W. Isherwood, a defense analyst and former Air Force fighter pilot.¶ Intelligence feeds
include electronic signals, satellite imagery, moving-target data and full-motion video. “How do
you integrate this into a clear picture?” Isherwood asks. “That is one of the enduring
challenges in the ISR [intelligence, surveillance and reconnaissance] arena for all the
services.”¶ Isherwood, the author of a Mitchell Institute white paper, titled, “Layering ISR
Forces,” cautions that success in future operations hinges on “timely, astute combinations of ISR
resources.”¶ The Pentagon would be wise to shift its future investments from sensors to dataanalysis tools, he says.¶ “The awareness gained from integrated, multi-source intelligence data is
of supreme value,” says Isherwood. ¶ In actual combat, a coherent picture of the battlefield is not a
“routine event,” he says. “Coalition forces in Afghanistan have suffered losses when they were
surprised by a much larger insurgent force not detected in time by ISR assets.Ӧ Military drone
operators amass untold amounts of data that never is fully analyzed because it is
simply too much, Isherwood says.¶ In the Air Force alone, the buildup of data collectors has
been dramatic. While its inventory of fighter, bomber, tanker and transport aircraft shrank by 11
percent over the past decade, ISR platforms — primarily unmanned air vehicles — increased by
nearly 300 percent, says Isherwood.¶ Air Force leaders have recognized this problem and recently
decided to cut its future purchases of Reaper drones in half — from 48 to 24 — because there is
not enough manpower to operate and process the data from more aircraft. “It didn’t make
sense to have the production out that far ahead of our ability to actually do the
processing and exploitation and dissemination function,” Deputy Assistant Secretary
of the Air Force for Budget Marilyn Thomas says at a February news conference. ¶ The military
services have funded programs to develop software algorithms to automate data analysis, but no
silver bullet has emerged. ¶ “Industry is working on tools so you can pull a Google Earth image and
incorporate the SIGINT [signals intelligence], the MTI [moving target indicator], visual imagery,
full-motion video,” Isherwood says. ¶ What the military needs is a “decathlete analyst” that can
process multiple feeds, versus an operator for each type of data, he says. Defense Department
leaders understand the problem, but the “acquisition community now needs to take
that and translate it into systems” that tackle this challenge.¶ The Air Force is “really
good at building an airplane,” Isherwood says. But he has yet to see a comparable requirements
document or request for technology that meshes all the sensors, he adds. “They go after it
piecemeal.”¶ The information deluge problem also is exacerbated by the military’s
organizational silos that zealously protect their data. ¶ “It’s hard to get the
community to plug their sensors in,” says Gregory G. Wenzel, vice president of advanced
enterprise integration at Booz Allen Hamilton.¶ The so-called “PED” process — processing,
exploitation and dissemination — has been a long-standing challenge, he says. “It’s a really hard
problem.Ӧ Automated analysis tools for video feeds are gradually entering the market, Wenzel
says. The National Football League has developed software to search video archives that some
defense contractors are using as a model. ¶ One of the more promising systems that could help
military ISR operators manage data more efficiently is the DI2E, or defense intelligence
information enterprise, says Wenzel. The entire Defense Department and intelligence community
will be able to share information, he says. The DI2E is a cloud-based system that draws data from
many sensors and databases. ¶ Technologies such as DI2E are part of a larger trend toward
networking sources of information, says Richard Sterk, senior aerospace and defense analyst at
Forecast International. “There’s still too many stand alone legacy systems.”¶
Regardless of advances in technology, he says, a larger conundrum for the
military is figuring out how to manage information so commanders and troops in
the field don’t become overwhelmed. “They have to sort out how much
information is enough,” says Sterk.¶ The Office of Naval Research and the Marine Corps have
been experimenting with another approach to analyzing data known as “semantic wiki.” ¶ It solves
the “intelligence fusion” problem, says George Eanes, vice president of Modus Operandi, a small
firm that developed the wiki tool. ¶ It’s a rather simple approach. “If I’m looking for something of
interest, like a white van, I can search across all the data stores that I have access to,” Eanes says.
“It presents it in a wiki format. … It’s a really good tool for pulling the data in from multiple
sources and present it in one convenient application.Ӧ Semantic wiki can search video, human
intelligence reports and satellite imagery. Streaming video could be added in the future, he says.
The company has spent the past three to four years working on this technology under several
small business innovation research contracts worth about $5 million, says Eanes. ¶ “There has
been a cultural shift within the Defense Department toward more desire to share information,”
Eanes says. “First they thought the solution was to bring everything into a single database. But
that proved impractical. There is too much data,” he says. “Now they’re looking at other solutions.
You keep the data where it resides. You access only the data you need.” ¶ Former Marine Corps
intelligence analyst Tony Barrett, who is now at Modus Operandi, says that during his time on
active duty, his team was overwhelmed by data. He would have liked to have had software to scan
unstructured data and provide relevant information, based on queries the analyst sets up, he says.
“That frees up the analyst to do due diligence rather than extended periods of research,” he adds.
“In Iraq, I had individual analysts that all they did was scan reports and find which ones were
relevant. … Research is extremely frustrating. I would rather my guys spent more time thinking.” ¶
Because of the data overload, “What you end up doing is taking your smartest
Marines who would be your biggest help in problem solving to work on your
system’s problems,” Barrett says. “I had my smartest guys always be the principal researchers
because I was more confident they would be able to discover more data than less talented
analysts.Ӧ ISR experts also worry that the military has become addicted to full-
motion video, at the expense of other intelligence disciplines that might gradually
disappear as the number of skilled operators declines. Video imagery is the most
“readily understood” intelligence, says Isherwood.¶ For the Iraq and Afghanistan wars, full-motion
video provided by aerial sensors was the preferred form of surveillance. But for other combat
scenarios in the future, Isherwood says, the military might need to rely on other types of
data such as signals intelligence (collection of electronic intercepts or emissions), moving
target indicator data (Doppler shifts of moving objects to detect and track targets), radar
imagery; and measurement and signals intelligence (combines radar, laser, optical,
infrared, acoustic, electromagnetic and atmospheric mediums to identify objects). There is also
“cyber-intelligence,” a new discipline that is based on electronic-warfare techniques, says
Isherwood. ¶ “Full motion video is what everybody wants,” says Chief of Naval Operations Adm.
Jonathan Greenert. “A still picture is good but you still have to send it back, develop it quickly,” he
says.¶ Access to full-motion video, however, might not be feasible in every conflict. “Not all fights
will be in the desert,” says Mel French, vice president of development at Telephonics, a supplier of
military sensors and electronic warfare systems. ¶ The unmanned aircraft-mounted
sensors that are favored today might not work in other environments. “The second
you introduce rain to any of those systems, the range goes down, it limits utility,” says French.¶
“We need to think of where else we are going to go,” he says. “Possibly places where we
need foliage penetration. That’s a hard problem to solve.”¶ The full-motion video soda straw view
works when the area is not being defended by adversaries who can shoot down surveillance
aircraft, he says.¶ In instances when ISR assets might be in danger and rather kept at
standoff ranges, the military will need analysts who can discern other forms of
data such as synthetic aperture radar images, French says. Some field commanders
might complain that they “don’t understand the [SAR] shadows,” he says. They might not realize
that video camera pictures can’t be obtained from 200 miles away. Images such as SAR require a
trained eye. ¶ As to whether there will be a time when analysts will be able to produce “actionable”
intelligence, French says there are no easy answers. ¶ “It’s one of those problems that will require
years of investments and focus,” he says. “We fielded a lot of Band-Aids. Now it’s getting back to
rationalizing what we fielded.”
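The federated approach Eanes describes ("you keep the data where it resides; you access only the data you need") can be sketched in a few lines. The stores, records, and the "white van" query below are invented for illustration, echoing his own example; a real semantic wiki adds ontologies and entity linking on top of this basic pattern.

```python
# Minimal sketch of federated search across separate intelligence stores,
# in the spirit of the Modus Operandi wiki tool described above.
# All store names and records are hypothetical.

video_tags = [
    {"source": "video", "id": "v17", "text": "white van near checkpoint"},
    {"source": "video", "id": "v32", "text": "crowd forming at market"},
]
humint_reports = [
    {"source": "humint", "id": "h05", "text": "white van seen twice this week"},
]
imagery_index = [
    {"source": "imagery", "id": "i88", "text": "vehicle convoy on route 4"},
]

def federated_search(term, *stores):
    """Run one query against each store in place; never copy the stores."""
    return [rec for store in stores for rec in store
            if term.lower() in rec["text"].lower()]

hits = federated_search("white van", video_tags, humint_reports, imagery_index)
for rec in hits:
    print(rec["source"], rec["id"], "-", rec["text"])
```

The design choice is the one the article attributes to the Defense Department's shift in thinking: rather than an impractical single database, the query travels to the data, and only matching records are pulled into "one convenient application."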
Excess surveillance data hampers military effectiveness and creates a
drag on the economy
Claburn 9 (Thomas, Editor at Large, Enterprise Mobility, 7/9, “Military Grapples With
Information Overload,” http://www.informationweek.com/architecture/military-grapples-with-information-overload/d/d-id/1081209?//Tang)
Surging surveillance data threatens to overwhelm the military's ability to deal with the
information. A report from a defense advisory group is calling for new data analysis technology and for taking a cue
from Google. Information overload has become a significant challenge for the U.S. military and will
require new analysis software and a Google-style cloud infrastructure to manage massive data sets, a U.S. defense
advisory group report finds. The December 2008 report, "Data Analysis Challenges," was initially
withheld from the public. It was obtained by the Federation of American Scientists' Project on
Government Secrecy through a Freedom of Information Act request. The report, written by JASON, a
group that provides advice to the Department of Defense (DoD) through the non-profit MITRE Corporation, says that the
massive amount of sensor and imagery data being gathered is becoming increasingly difficult to
store, analyze, and integrate into defense systems. For example, a DoD surveillance system called
Constant Hawk typically produces 10's to 100's of Terabytes of data over a period of a few hours.
For that information to be useful, it has to be stored, analyzed, and distributed quickly. The report,
however, cites concerns voiced by members of the defense and intelligence communities that much of the
surveillance data gathered isn't made useful. "Seventy percent of the data we collect is falling on
the floor," MIT defense research scientist Pete Rustan said, according to the report. And the problem is
likely to get worse. "As the sensors associated with the various surveillance missions improve, the data volumes are
increasing with a projection that sensor data volume could potentially increase to the level of Yottabytes
(10^24 Bytes) by 2015," the report says. Jonathan B. Spira, CEO and chief analyst at research consultancy Basex, author of
the forthcoming book Overload!, and organizer of Information Overload Awareness Day on August 12, says
information overload is a real problem in the workplace, in government and in the military . "We've
seen on the military side, many instances where information overload can create a whole new kind of fog
[of war]," he said. Information overload costs the U.S. economy $900 billion per year, according to
Basex. The JASON report discounts some of the more extreme projections about data volume growth and recommends
that the DoD deploy infrastructure similar to that used by Google, Microsoft, and Yahoo. It also sees military applications
for the Hive language used by Facebook for data warehousing. The major problem the DoD faces will be in the
area of automated information analysis. "The notion of fully automated analysis is today at best a
distant reality, and for this reason, it is critical to invest in research to promote algorithmic advances," the report says.
"One way to effectively engage the relevant research communities is through the use of grand challenges in the area of data
analysis." Spira sees information overload
as a broader problem, one that won't vanish with the
development of improved automated information analysis technology. He described a
cybersecurity conference at Maxwell Air Force Base, where military brass had gathered to discuss cyber
threats. Emerging from the talk, the generals found they had no e-mail, he said. It turned out that the base's e-mail
system had been taken down, not by a cyber attack, but by an e-mail about a card game that got
forwarded and, through too many reply-alls, multiplied until over a million messages overloaded
the e-mail servers. "We need to address a lot of different aspects of data and information overload, not just things that
sound sexy," said Spira.
Data collection has reached the neurological breaking point –
additional data will make it impossible to function in the field
Shanker and Richtel 11 (Thom and Matt, writers for the NYT, cites psychologists and
neuroscientists, 1/16, “In New Military, Data Overload Can Be Deadly,”
http://www.nytimes.com/2011/01/17/technology/17brain.html?pagewanted=all&_r=0//Tang)
When military investigators looked into an attack by American helicopters last
February that left 23 Afghan civilians dead, they found that the operator of a
Predator drone had failed to pass along crucial information about the makeup of
a gathering crowd of villagers. But Air Force and Army officials now say there was also an
underlying cause for that mistake: information overload. At an Air Force base in Nevada, the
drone operator and his team struggled to work out what was happening in the
village, where a convoy was forming. They had to monitor the drone’s video feeds while
participating in dozens of instant-message and radio exchanges with intelligence
analysts and troops on the ground. There were solid reports that the group included children, but the
team did not adequately focus on them amid the swirl of data — much like a cubicle worker who loses track of an
important e-mail under the mounting pile. The team was under intense pressure to protect American forces nearby, and in
the end it determined, incorrectly, that the villagers’ convoy posed an imminent threat, resulting in one of the worst losses
of civilian lives in the war in Afghanistan. “Information overload — an accurate description,” said one senior military
officer, who was briefed on the inquiry and spoke on the condition of anonymity because the case might yet result in a
court martial. The deaths
would have been prevented, he said, “if we had just slowed
things down and thought deliberately.” Data is among the most potent weapons of
the 21st century. Unprecedented amounts of raw information help the military
determine what targets to hit and what to avoid. And drone-based sensors have
given rise to a new class of wired warriors who must filter the information sea. But
sometimes they are drowning. Research shows that the kind of intense multitasking required in such situations can make
it hard to tell good information from bad. The military
faces a balancing act: how to help soldiers
exploit masses of data without succumbing to overload. Across the military, the
data flow has surged; since the attacks of 9/11, the amount of intelligence
gathered by remotely piloted drones and other surveillance technologies has risen
1,600 percent. On the ground, troops increasingly use hand-held devices to communicate, get directions and set
bombing coordinates. And the screens in jets can be so packed with data that some pilots
call them “drool buckets” because, they say, they can get lost staring into them.
“There is information overload at every level of the military — from the general to the soldier
on the ground,” said Art Kramer, a neuroscientist and director of the Beckman Institute, a research lab at the University of
Illinois. The
military has engaged researchers like Mr. Kramer to help it understand
the brain’s limits and potential. Just as the military has long pushed technology forward, it is now at the
forefront in figuring out how humans can cope with technology without being overwhelmed by it. At George Mason
University in Virginia, researchers measure the brain waves of study subjects as they use a simulation of the work done at
the Nevada Air Force base. On a computer screen, the subjects see a video feed from one drone and the locations of others,
along with instructions on where to direct them. The subjects wear a cap with electrodes attached, measuring brain waves.
As the number of drones and the pace of instructions increases, the brain shows
sharp spikes in a kind of electrical activity called theta — cause for concern
among the researchers. “It’s usually an index of extreme overload,” said Raja
Parasuraman, a director of the university’s human factors and applied cognition program. As the technology
allows soldiers to pull in more information, it strains their brains. And military
researchers say the stress of combat makes matters worse. Some research even suggests that younger people wind up
having more trouble focusing because they have grown up constantly switching their attention. For the soldier who has
been using computers and phones all his life, “multitasking might actually have negative effects,” said Michael Barnes,
research psychologist at the Army Research Lab at Aberdeen, Md., citing several university studies on the subject. In tests
at a base in Orlando, Mr. Barnes’s group has found that when soldiers operate a tank while monitoring remote video feeds,
they often fail to see targets right around them. Mr.
Barnes said soldiers could be trained to use
new technology, “but we’re not going to improve the neurological capability.” On the
other hand, he said, the military should not shy away from improving the flow of data in combat. “It would be like saying
we shouldn’t have automobiles because we have 40,000 people die on the roads each year,” he said. “The pluses of
technology are too great.” The military is trying novel approaches to helping soldiers focus. At an Army base on Oahu,
Hawaii, researchers are training soldiers’ brains with a program called “mindfulness-based mind fitness training.” It asks
soldiers to concentrate on a part of their body, the feeling of a foot on the floor or of sitting on a chair, and then move to
another focus, like listening to the hum of the air-conditioner or passing cars. “The whole question we’re asking is whether
we can rewire the functioning of the attention system through mindfulness,” said one of the researchers, Elizabeth A.
Stanley, an assistant professor of security studies at Georgetown University. Recently she received financing to bring the
training to a Marine base, and preliminary results from a related pilot study she did with Amishi Jha, a neuroscientist at
the University of Miami, found that it helped Marines to focus. Even as it worries about digital overload, the Army is
acknowledging that technology may be the best way to teach this new generation of soldiers — in particular, a technology
that is already in their pockets. In Army basic training, new recruits can get instruction from iPhone apps on subjects as
varied as first aid and military values. As part of the updated basic training regimen, recruits are actually forced into
information overload — for example, testing first aid skills while running an obstacle course. “It’s the way this generation
learns,” said Lt. Gen. Mark P. Hertling, who oversees initial training for every soldier. “It’s a multitasking generation. So if
they’re multitasking and combining things, that’s the way we should be training.” The intensity of warfare in the computer
age is on display at a secret intelligence and surveillance installation at Langley Air Force Base in Virginia, a massive,
heavily air-conditioned warehouse where hundreds of TVs hang from black rafters. Every
day across the Air
Force’s $5 billion global surveillance network, cubicle warriors review 1,000
hours of video, 1,000 high-altitude spy photos and hundreds of hours of “signals
intelligence” — usually cellphone calls. At the Langley center, officially called Distributed Common
Ground System-1, heavy multitasking is a daily routine for people like Josh, a 25-year-old first lieutenant (for security
reasons, the Air Force would not release his full name). For 12 hours a day, he monitors an avalanche of images on 10
overhead television screens. They deliver what Josh and his colleagues have nicknamed “Death TV” — live video streams
from drones above Afghanistan showing Taliban movements, suspected insurgent safehouses and American combat units
headed into battle. As he watches, Josh uses a classified instant-messaging system showing as many as 30 different chats
with commanders at the front, troops in combat and headquarters at the rear. And he is hearing the voice of a pilot at the
controls of a U-2 spy plane high in the stratosphere. “I’ll have a phone in one ear, talking to a pilot on the headset in the
other ear, typing in chat at the same time and watching screens,” Josh says. “It’s intense.” The stress lingers when the shift
is over. Josh works alongside Anthony, 23, an airman first class who says his brain hurts each night, the way feet ache
after a long march. “You have so much information coming in that when you go home — how do you take that away?
Sometimes I work out,” Anthony said. “Actually, one of my things is just being able to enjoy a nice bowl of cereal with
almond milk. I feel the tension is just gone and I can go back again.” Video games don’t do the trick. “I need something
real,” he said.
We’ve gotta deal with overload now to improve counterterrorism long
term
Shanker and Richtel, 11
(Thom and Matt, Graduate Tufts University - The Fletcher School of Law and Diplomacy, Pulitzer
prize winning author, New York Times, 1-16-11, “In New Military, Data Overload Can Be Deadly”,
http://www.umsl.edu/~sauterv/DSS4BI/links/17brain.pdf, amp)
Mr. Barnes said soldiers could be trained to use new technology, “but we’re not going
to improve the neurological capability.” On the other hand, he said, the military should not shy
away from improving the flow of data in combat. “It would be like saying we
shouldn’t have automobiles because we have 40,000 people die on the roads each
year,” he said. “The pluses of technology are too great.”
A2: tech solves
Tech can’t solve – it’s not fast enough
Horvitz, 13
Leslie Alan Horvitz, American author, “Information Overload: Babel, Borges and the NSA,”
7/2/13, http://lesliehorvitz.com/blog/2013/7/2/information-overload-babel-borges-and-the-nsa
// IS
NSA and other security agents rely on computers using a variety of algorithms
(some of them designed to search for key words like ‘terrorism’) to find the
hoped-for needles in the ever expanding haystack. But I suspect that
technology is incapable of keeping up. The data threatens to become
indigestible. As soon as you bring humans into the equation – and eventually you
need analysts to assess the credibility of the information and determine whether it is actionable or
not – you run the risk of errors, bad judgment and bias. And it takes time – lots of time. So
analysts couldn’t get to them all; instead they put aside what used to be called “bit buckets” in the
industry —electronic bits that someday would have to be sorted out…by someone. According to
James Lewis, a cyberexpert quoted in The New York Times, “They park stuff in storage in
the hopes that they will eventually have time to get to it,” although he admitted
that “most of it sits and is never looked at by anyone.” As another expert put it: “This
means that if you can’t desalinate all the seawater at once, you get to hold on to the ocean until
you figure it out.”
Technology doesn’t check mass surveillance inefficiencies
Ferguson, 1/16 (DAVID FERGUSON, journalist Raw Story, “Mass surveillance is ineffective
at fighting terrorism and makes us less safe, says tech expert” 16 JAN 2015 AT 12:53 ET
http://www.rawstory.com/2015/01/mass-surveillance-is-ineffective-at-fighting-terrorism-and-makes-us-less-safe-says-tech-expert/) //GY
Mass surveillance has proven to be an ineffective tool against terrorists, and yet in
the wake of the attacks on the offices of the French satirical magazine Charlie Hebdo, many
politicians are calling for even tighter surveillance on private citizens.¶ In a Thursday column for
New Scientist, Open University technology specialist Ray Corrigan explained that mass
electronic surveillance will never be an effective means of ensuring public safety,
no matter how sophisticated the technology becomes or how granular a
level at which officials become capable of examining our lives.¶ “Prime Minister David
Cameron wants to reintroduce the so-called snoopers’ charter — properly, the Communications
Data Bill — which would compel telecoms companies to keep records of all internet, email and
cellphone activity,” wrote Corrigan. The Prime Minister also wants to ban all forms of encrypted
communication like Apple iMessage and the message service WhatsApp.¶ However, Corrigan
pointed out, “Brothers Said and Cherif Kouachi and Amedy Coulibaly, who
murdered 17 people, were known to the French security services and considered a
serious threat. France has blanket electronic surveillance. It didn’t avert what
happened.Ӧ In France, authorities lost track of the extremists just long enough for
them to carry out their attack. Surveillance systems are imperfect, Corrigan said, and
blanket data gathering is a wildly inefficient way to weed out potential terror
suspects. It generates too much useless information to sift through, he said, and
often misses vital information that only becomes clear in hindsight.¶ “You cannot fix
any of this by treating the entire population as suspects and then engaging in suspicionless,
blanket collection and processing of personal data,” he said. It simply doesn’t work.¶ In fact, the
practice may make populations less safe by generating so much data that it
becomes statistically impossible for investigators to spot actual leads, generating
false positives at an astonishing rate.¶ “Even if your magic terrorist-catching machine has a
false positive rate of 1 in 1000 — and no security technology comes anywhere near this — every
time you asked it for suspects in the UK it would flag 60,000 innocent people,” said Corrigan. ¶
“Surveillance of the entire population, the vast majority of whom are innocent, leads to the
diversion of limited intelligence resources in pursuit of huge numbers of false leads. Terrorists
are comparatively rare, so finding one is a needle in a haystack problem. You
don’t make it easier by throwing more needleless hay on the stack,” he wrote.¶ In the
U.S., a series of revelations from intelligence contractor turned whistleblower Edward Snowden
revealed programs through which the National Security Agency is gathering information on
average citizens, outraging privacy advocates and opening an international debate on the legality
of mass surveillance. Now, in addition to being legally dubious, years into the surveillance
programs, the practice of indiscriminate data-gathering has neither caught any
terrorists nor prevented any attacks.¶ On Friday, the American Civil Liberties Union
reported on the newly-released results of a year-long investigation by the National Academies:
Bulk Collection of Signals Intelligence: Technical Operations 2015.¶ Neema Singh Guliani of the
ACLU revealed that the report showed “the domestic nationwide call detail record program has
never stopped an act of terrorism or led to the identification of a terrorist suspect.” ¶ Furthermore,
“the report did not find that the resource costs, privacy impacts, and economic
harms associated with bulk collection are balanced by any concrete benefits in
intelligence capabilities,” Guliani wrote.¶ “Finally,” she said, “the report acknowledges that
there are additional steps that the intelligence community can take to increase transparency,
improve oversight, and limit the use of information collected through surveillance.” ¶ In his
column, Corrigan wrote that law enforcement agencies need to “use modern digital technologies
intelligently and through targeted data preservation — not a mass surveillance regime — to
engage in court-supervised technological surveillance of individuals whom they have reasonable
cause to suspect.”¶ “That is not, however,” he insisted, “the same as building an infrastructure of
mass surveillance.”
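Corrigan's base-rate arithmetic above can be sketched in a few lines. This is a minimal illustration, not from the article itself: it assumes a UK population of roughly 60 million, takes Corrigan's hypothetical 1-in-1,000 false-positive rate, and adds an illustrative assumption of 100 actual terrorists and a perfect detection rate (both generous to the surveillance system).

```python
# Base-rate sketch of Corrigan's "magic terrorist-catching machine" example.
# Assumptions beyond the article's 1-in-1,000 rate: ~60M UK population,
# 100 actual terrorists (illustrative), and a machine that never misses.

population = 60_000_000          # approximate UK population
false_positive_rate = 1 / 1000   # Corrigan's hypothetical best-case rate
true_positive_rate = 1.0         # generous: assume no terrorist is missed
actual_terrorists = 100          # illustrative assumption

innocents = population - actual_terrorists
false_positives = innocents * false_positive_rate
true_positives = actual_terrorists * true_positive_rate

# Probability that any given flagged person is actually a terrorist
precision = true_positives / (true_positives + false_positives)

print(f"Innocent people flagged: {false_positives:,.0f}")
print(f"Chance a flagged person is a terrorist: {precision:.4%}")
```

The flagged count lands at about 60,000, matching the article's figure, and even under these generous assumptions fewer than 1 in 500 flagged people would be a real suspect — the "needleless hay" problem in numbers.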
Technology can’t check overload
The SIGINT Philosopher, 11
The SIGINT Philosopher, a Russian language analyst employed by SID, “The SIGINT
Philosopher: Cognitive Overload?” 4/15/11,
https://s3.amazonaws.com/s3.documentcloud.org/documents/2088972/cognitive-overflow.pdf
// IS
(U) There's a computer sitting atop your shoulders. Granted, real computers can apparently best
human brains on Jeopardy with ease, but all the same... Since Noam Chomsky and his cohorts at
MIT opened the floodgates to the study of how we think, cognitive psychology has come a long
way. Although the ensuing decades of research have highlighted the astounding capabilities of our
gray matter, the field has also exposed the limitations our brains are subject to. It may be worth
considering the implications these limitations have for our work in SID.
(U) "Channel capacity" is the term some cognitive psychologists have begun to
apply to the brain's limits on the amount of certain information it can retain. For
instance, research shows that the average person can only differentiate between 5-9
different tones, shapes, or textures at a given time. Any more, and our capacity to
categorize becomes overtaxed, and we begin to make mistakes. In other words,
the servers overload. It is yet another example of how unprepared humans are, in
evolutionary terms, for the information age. Evolutionary biologist Sherwood Washburn once
wrote:
(U) "Most of human evolution took place before the advent of agriculture, when we lived in small
groups, face-to-face. Man evolved to feel strongly about few people, short distances, and relatively
brief intervals."
(U) The question then becomes: If an individual brain has finite "channel capacity,"
does the vast collective of SID, comprised of thousands of brilliant, yet limited,
brains also have a definite "channel capacity"? If so, what is it? How do we know when
we've reached it? What if we've already exceeded it? In essence, could SID's reach exceed its
grasp? Can the combined cognitive power of SID connect all the necessary dots to avoid, predict,
or advise when the improbable, complex, or unthinkable happens?
(U) Take for example the number of tools, clearances, systems, compliances, and
administrative requirements we encounter before we even begin to engage in the work of
the mission itself. The mission then involves an ever-expanding set of complex
issues, targets, accesses, and capabilities. The "cognitive burden," so to speak,
must at times feel overwhelming to some of us. The SID is an organism with
many moving parts. So how do we ensure our SIGINT potential is in line with,
and doesn't overwhelm our collective cognitive capacity? Can we count on our
overarching SID mechanism to self-regulate, to organically cull, sort, and retain? Or is there
perhaps something extra we ought to be doing to ensure we operate at full exploitative and
analytic force?
(U) Surely someone will point out that the burgeoning amalgam of technological
advances will aid us in shouldering the burden. However, historically, this
scenario doesn't seem to completely bear out. The onslaught of more computing
power—often intended to automate some processes-has in many respects
demanded an expansion of our combined "channel capacity," rather than curbing
the flow of the information that's necessary to retain.
(U) It's an issue worth thinking about and discussing. In the meantime, I'll be working on my 14-character password...
(U) Editor's note: See a Tapioca Pebble on this topic.
A2: more data solves
More data can’t solve – current spending and Britain prove
Maass, 15
Peter Maass, a Guggenheim Fellow on the advisory boards of the Solutions Journalism Network,
and the Program for Narrative and Documentary Practice at Tufts University, “Inside NSA,
Officials Privately Criticize ‘Collect it All’ Surveillance,” The Intercept, 5/28/15,
https://firstlook.org/theintercept/2015/05/28/nsa-officials-privately-criticize-collect-it-all-surveillance/ // IS
The agency appears to be spending significant sums of money to solve the
haystack problem. The document headlined “Dealing With a ‘Tsunami’ of Intercept,” written
in 2006 by three NSA officials and previously published by The Intercept, outlined a series of
programs to prepare for a near future in which the speed and volume of signals intelligence would
explode “almost beyond imagination.” The document referred to a mysterious NSA
entity–the “Coping With Information Overload Office.” This appears to be related
to an item in the Intelligence Community’s 2013 Budget Justification to
Congress, known as the “black budget”—$48.6 million for projects related to
“Coping with Information Overload.” The data glut is felt in the NSA’s partner
agency in Britain, too. A slideshow entitled “A Short Introduction to SIGINT,” from GCHQ,
the British intelligence agency, posed the following question: “How are people
supposed to keep on top of all their targets and the new ones when they have far
more than [they] could do in a day? How are they supposed to find the needle in
the haystack and prioritise what is most important to look at?” The slideshow
continued, “Give an analyst three leads, one of which is interesting: they may
have time to follow that up. Give them three hundred leads, ten of which are
interesting: that’s probably not much use.”
More data fails – statistics – and their evidence is hype
Bergen et al., 14
Peter Bergen, David Sterman, Emily Schneider, and Bailey Cahill, *Peter Bergen is an American
print and broadcast journalist, author, documentary producer, and CNN's national security
analyst. **David Sterman is a program associate at New America and holds a master's degree
from Georgetown's Center for Security Studies, ***senior program associate for the International
Security Program at New America, “Do NSA's Bulk Surveillance Programs Stop Terrorists?” New
America Foundation, January 2014,
https://www.newamerica.org/downloads/IS_NSA_surveillance.pdf // IS
The administration has repeatedly exaggerated the role of NSA bulk
surveillance programs in preventing terrorism and is misleading the public when
it says that 9/11 could have been prevented by such programs when, in fact,
better information-sharing about already existing intelligence would have been
far more effective in preventing 9/11.
Members of Congress,
senior government officials, and NSA officials have justified
the programs with statements about how many terrorist events the surveillance
programs have foiled - citing a total of 54 “events” around the globe, of which 13
were in the United States - and have warned of the risk of a future 9/11-like attack if
the programs were curtailed. As mentioned above, President Obama defended the NSA
surveillance programs during a visit to Berlin in June, saying: “We know of at least 50 threats that
have been averted because of this information not just in the United States, but, in some cases,
threats here in Germany. So lives have been saved.”39 Gen. Alexander testified before Congress
that: “the information gathered from these programs provided the U.S. government with critical
leads to help prevent over 50 potential terrorist events in more than 20 countries around the
world.”40 Rep. Mike Rogers, chairman of the House Permanent Select Committee on
Intelligence, said on the chamber floor in July that NSA programs “stopped and thwarted terrorist
attacks both here and in Europe - saving real lives” a total of 54 times.41
The government’s defense has demonstrated a lack of precision regarding the
exact nature of the threats in the terrorism cases the government has claimed
were prevented by NSA surveillance. Were they real attacks that were thwarted?
Serious plots that were still somewhere in the planning stages? Plots that were
concerning, but never really operational? Or did they involve some sort of
terrorism-support activity, such as fundraising? President Obama has called them
“threats,” Gen. Alexander called them “events” and then later used the term “activities,” while
Rep. Rogers and one of Gen. Alexander’s slides at the 2013 Black Hat conference referred to them
as “attacks.”42
Sen. Leahy brought attention to this disconnect at a Senate Judiciary Committee hearing in July
2013, saying he had been shown a classified list of “terrorist events” detected through surveillance
which did not show that “dozens or even several terrorist plots” had been thwarted by the
collection of American telephone metadata under Section 215.43 Sen. Leahy asked Gen.
Alexander: “Would you agree that the 54 cases that keep getting cited by the
administration were not all plots, and of the 54, only 13 had some nexus to the
U.S.?” and Gen. Alexander’s reply was a simple “Yes.”44 On this key point, beyond his
one-word answer, the NSA director did not elaborate while under oath.
Leading reporters have sometimes simply parroted the government claims that
more than 50 attacks have been averted. Bob Schieffer of CBS News, for instance, said on
“Face the Nation” on July 28: “Fifty-six terror plots here and abroad have been thwarted by the
NASA [sic] program. So what’s wrong with it, then, if it’s managed to stop 56 terrorist attacks?
That sounds like a pretty good record.”45 This misrepresentation in the media most
likely stems from confusion about what this oft-cited 54 number really refers to: terrorist activity such as fundraising, plots that were really only notional, or
actual averted attacks.
Despite the government’s narrative that NSA surveillance of some kind prevented 13 domestic
“events” or “attacks” in the United States, of the eight cases we have identified as
possibly involving the NSA, including the three the government has not claimed,
only one can be said to involve an operational al-Qaeda plot to conduct an attack
within the United States, three were notional plots, and one involved an attack
plan in Europe. And in three of the plots we identified as possibly having been
prevented by the NSA - Moalin, Muhtorov and Jumaev, and Warsame - the
defendants were committing or allegedly committing crimes of support for a
terrorist group, rather than plotting terrorist attacks.
More data fails – empirics
Bergen et al., 14
Peter Bergen, David Sterman, Emily Schneider, and Bailey Cahill, *Peter Bergen is an American
print and broadcast journalist, author, documentary producer, and CNN's national security
analyst. **David Sterman is a program associate at New America and holds a master's degree
from Georgetown's Center for Security Studies, ***senior program associate for the International
Security Program at New America, “Do NSA's Bulk Surveillance Programs Stop Terrorists?” New
America Foundation, January 2014,
https://www.newamerica.org/downloads/IS_NSA_surveillance.pdf // IS
These multiple missed opportunities challenge the administration’s claims that the
NSA’s bulk surveillance program could have prevented the 9/11 attacks. The key
problem was one of information-sharing, not lack of information. If
information-sharing had been functioning, Mihdhar would likely have been
caught regardless of the collection of telephone metadata, and if information- sharing was
not functioning, it is unclear why collecting more information would have
changed the result. Even if Mihdhar’s phone calls from San Diego to Yemen is considered a
moment for preventing the 9/11 attacks, it is likely that more targeted surveillance of that phone
number rather than bulk collection of metadata would have been sufficient. Communications to
and from the house in Yemen were already being intercepted by the NSA as a result of
investigations into the 1998 U.S. embassy bombings in Africa and the USS Cole bombing in
2000.62 According to U.S. officials quoted by Josh Meyer, a leading national security reporter at
the Los Angeles Times, the information from the calls could have been shared through a FISA
warrant under the authorities the NSA had even before 9/11.63 The United States government
could and should have been alerted to Mihdhar’s phone calls even without the expanded authority
to collect the telephone metadata of all Americans under Section 215. Indeed, Richard Clarke, the
national coordinator for security, infrastructure protection, and counterterrorism from 1998 to
2001, has explained that the Justice Department “could have asked the FISA Court for a warrant
to all phone companies to show all calls from the U.S. which went to the Yemen number. As far as
I know, they did not do so. They could have.”64 Clarke played down the need for bulk collection
in such a scenario, continuing, “My understanding is that they did not need the current All Calls
Data Base FISA warrant to get the information they needed. Since they had one end of the calls
(the Yemen number), all they had to do was ask for any call connecting to it.”65 (Clarke was one
of the five members of the White House review group that President Obama established in August
2013 to review the U.S. government’s surveillance activities and which issued its report on
December 18, 2013). The overall problem for U.S. counterterrorism officials is not
that they need the information from the bulk collection of phone data, but that
they don’t sufficiently understand or widely share the information they already
possess that is derived from conventional law enforcement and intelligence
techniques. This was true of the two 9/11 hijackers living in San Diego and it is
also the unfortunate pattern we have seen in several other significant terrorism
cases: • Chicago resident David Coleman Headley was central to the planning of the 2008
terrorist attacks in Mumbai that killed 166 people. Yet, following the 9/11 attacks, U.S.
authorities received plausible tips regarding Headley’s associations with militant groups at least
five times from his family members, friends, and acquaintances.66 These multiple tips were never
followed up in an effective fashion. • Maj. Nidal Hasan, a U.S. Army psychiatrist, killed 13 people
at Fort Hood, Texas, in 2009. Before the attack, U.S. intelligence agencies had intercepted
multiple emails between Maj. Hasan and Anwar al-Awlaki, a U.S.- born cleric living in Yemen
who was notorious for his ties to militants. The emails included a discussion of the permissibility
in Islam of killing U.S. soldiers. Counterterrorism investigators didn’t follow up on these emails,
believing that they were somehow consistent with Maj. Hasan’s job as a military psychiatrist.67 •
Carlos Bledsoe, a convert to Islam, fatally shot a soldier at a Little Rock, Ark., military
recruiting office in 2009, several months after returning from a stay in Yemen. As a result of that
trip, Bledsoe was under investigation by the FBI. Yet, he was still able to buy the weapons for his
deadly attack when he was back in the United States.68 • Nigerian Umar Farouq Abdulmutallab
attempted to blow up Northwest Flight 253 over Detroit on Christmas Day 2009 with an
“underwear bomb.” Fortunately, the bomb failed to explode. Yet, a few weeks before the botched
attack, Abdulmutallab’s father contacted the U.S. Embassy in Nigeria with concerns that his son
had become radicalized and might be planning something.69 This information wasn’t further
investigated. Abdulmutallab had been recruited by al-Qaeda’s branch in Yemen for the mission.
The White House review of the bomb plot concluded that there was sufficient information known
to the U.S. government to determine that Abdulmutallab was likely working for al-Qaeda in
Yemen and that the group was looking to expand its attacks beyond Yemen.70 Yet, Abdulmutallab
was allowed to board a plane bound for the United States without any question. All of the
missed opportunities in these serious terrorism cases argue not for the gathering
of ever-more vast troves of information, but simply for a better understanding of
the information the government has already collected that was derived from
conventional law enforcement and intelligence methods.
Network and pattern identification fails
Keefe, 6
(Patrick Radden, Century Foundation fellow, author of 'Chatter: Dispatches from the Secret
World of Global Eavesdropping’, 3-12-2006, New York Times, “Can Network Theory Thwart
Terrorists?”, lexis, amp)
Network academics caution that the field is still in its infancy and should not be
regarded as a panacea. Duncan Watts of Columbia University points out that it's
much easier to trace a network when you can already identify some of its
members. But much social-network research involves simply trawling large databases
for telltale behaviors or activities that might be typical of a terrorist. In this case the
links among people are not based on actual relationships at all, but on an
''affiliation network,'' in which individuals are connected by virtue of taking part in a
similar activity. This sort of approach has been effective for corporations in
detecting fraud. A credit-card company knows that when someone uses a card to purchase $2
of gas at a gas station, and then 20 minutes later makes an expensive purchase at an electronics
store, there's a high probability that the card has been stolen. Marc Sageman, a former C.I.A. case
officer who wrote a book on terror networks, notes that correlating certain signature behaviors
could be one way of tracking terrorists: jihadist groups in Virginia and Australia exercised at
paint-ball courses, so analysts could look for Muslim militants who play paint ball, he suggests.
But whereas there is a long history of signature behaviors that indicate fraud, jihadist
terror networks are a relatively new phenomena and offer fewer
reliable patterns.
There is also some doubt that identifying hubs will do much good. Networks are by
their very nature robust and resistant to attack. After all, while numerous high ranking
Qaeda leaders have been captured or killed in the years since Sept. 11, the network still
appears to be functioning. ''If you shoot the C.E.O., they'll hire another
one,'' Duncan Watts says. ''The job will still get done.''
A2: Congress Checks Overload
Congress can’t check overload
Shoemaker, 15
Tim Shoemaker, the Director of Legislation at Campaign for Liberty. He graduated Magna Cum
Laude with a Bachelor of Arts from Indiana University of Pennsylvania. “Can Congress Effectively
Oversee the Vast Surveillance State,” Campaign for Liberty, 4/8/15,
http://www.campaignforliberty.org/can-congress-effectively-oversee-vast-surveillance-state //
IS
According to a new report from the Associated Press, the Senate Intelligence
Committee is creating a sort of "secret encyclopedia" of America's surveillance
programs.
Surprisingly, this hasn't picked up as much media attention as it should.
What the report actually tells us, without directly saying so, is Congress isn't
capable of conducting informed, effective oversight of the surveillance state.
Despite calling Snowden's actions "treason" at the time, it's clear that Feinstein and other
members of Congress were completely unaware of the foreign surveillance being
conducted under Executive Order 12333 -- and would never have learned of the
programs being carried out by a small number of Executive Branch employees
without his whistleblowing activities.
Of course, what ought to upset us all is how the Intel Committee members HAD been
briefed on some of the most controversial intelligence programs such as the
surveillance of Americans' phone records and the PRISM program, and other than
Ron Wyden and Mark Udall, none of them seemed to be overly concerned
about how Americans' civil liberties were being routinely violated.
A2: wastes $
Mass data collection is cost efficient
Cohen, ’14 (Drew F. Cohen, former law clerk to the chief justice of the Constitutional Court of
South Africa “The Shrinking Price of Mass Surveillance” February 10, 2014
http://nationalinterest.org/commentary/the-shrinking-price-mass-surveillance-9848?page=2)
//GY
By now most Americans agree that the NSA surveillance program, brought to light by leaked
documents from former NSA contractor Edward Snowden, went “too far.” Just because the spy
agency could keep tabs on every American, did not mean it should. The price of surveillance
technology, however, had dropped so precipitously over the last two decades that
once the agency overcame any moral objections it had about the program, few practical
considerations stood in its way of implementing a system that could monitor 315 million
Americans every day. Indeed, one estimate tagged the NSA’s annual surveillance costs
at $574 per taxpayer, amounting to a paltry six-and-a-half cents an hour.¶ If privacy
law experts Kevin S. Bankston and Ashkan Soltani are correct, costs, once a significant check on
government spying and police monitoring efforts, have become an afterthought. In a recent study
published in the Yale Law Journal Online, Bankston and Soltani found that most technologies
deployed for mass-surveillance efforts by police departments (e.g., GPS devices and domestic
drones) exhibit similar cost structures to the NSA spying program: as the number of subjects
increases, the cost of keeping tabs on each target nears zero. Cheaper, more effective
tracking devices have been a boon to cash-strapped police departments
nationwide, largely to the dismay of civil-liberties groups.
…
Newer surveillance technologies, however, were significantly cheaper when costs
were tabulated on an hourly basis. The total price tag of tracking a suspect using a GPS
device, similar to the one in Jones for instance, came out to $10 an hour over one day, $1.43 per
hour over a week and $0.36 per hour over a month. Another relatively new technique, obtaining a
suspect’s location through their cell phone signal with the carrier’s assistance, yielded similar
results. As of August 2009, fees for obtaining cell phone location data from carriers ranged from
$0.04 to $4.17 per hour for one month of surveillance. (Sprint, for example, charges $30 per
month per target while T-Mobile charges $100 per day).¶ After tabulating their results,
Bankston and Soltani concluded that the total cost of using a GPS device to track
a suspect over twenty-eight days (the method rejected in Jones) was roughly
three hundred times less expensive than the same tracking using a transmitter
(technology approved by the Supreme Court) and 775 times less expensive than using the
five-car pursuit method (also approved). Meanwhile, the cost of using transmitter surveillance technology was only 2.5 times less expensive than undercover car pursuit.
…
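The hourly figures in the Bankston and Soltani study fall out of simple division: a roughly fixed total cost spread over more hours of surveillance. A minimal Python sketch of that arithmetic, using the card's $574-per-taxpayer annual NSA figure and a flat GPS-device cost of about $240, which is an inference from the card's own $10/$1.43/$0.36 hourly numbers rather than a figure the study states:

```python
def cost_per_hour(total_cost, hours):
    """Spread a fixed surveillance cost over the hours it covers."""
    return total_cost / hours

# NSA: $574 per taxpayer per year -> about six and a half cents an hour
print(round(cost_per_hour(574, 365 * 24), 3))   # ~0.066

# GPS device: a flat ~$240 total reproduces the card's hourly figures
print(round(cost_per_hour(240, 24), 2))         # 10.0  (one day)
print(round(cost_per_hour(240, 24 * 7), 2))     # 1.43  (one week)
print(round(cost_per_hour(240, 24 * 28), 2))    # 0.36  (~one month)
```

The near-zero marginal cost the card describes is visible here: the numerator barely changes while the hours grow.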
A2: Big Data Initiative
BDI isn’t enough – it’s just a first step
Kirby, 13
Bob Kirby, vice president of sales for CDW·G, a leading technology provider to government and
education. “Big Data Can Help the Federal Government Move Mountains. Here's How.,” FedTech
Magazine, 08/01/13, http://www.fedtechmagazine.com/article/2013/08/big-data-can-help-federal-government-move-mountains-heres-how // IS
The challenges to maximizing the use of Big Data are great, but the opportunities
it opens are even greater. The White House initiative is a good first step, but
agencies must mobilize. Several federal projects have established a path that
others can follow to make the most of these opportunities. When they invest in
technologies to help them deal with the massive piles of data they are creating, agencies may find
that they can move mountains.
A2: CIA
CIA mass surveillance fails
Fingas, 4/25 (Jon Fingas, Associate Editor Engadget “The CIA couldn't properly use a mass
surveillance program for years” April 25th 2015 http://www.engadget.com/2015/04/25/cia-mass-surveillance-problems/) //GY
Whatever you think about the morality of using mass surveillance to catch evildoers, the
technology only works if people can use it -- just ask the CIA. The New York
Times has obtained a declassified report revealing that the agency was
largely kept in the dark about the President's Surveillance Program (aka Stellarwind), which
allows for bulk data collection, until at least 2009. Only the highest-ranking officials could
use PSP as a general rule, and those few agents that did have access often didn't know
enough to use it properly, faced "competing priorities" or had other tools at their
disposal. To boot, there wasn't documentation showing how effective the program
was in fighting terrorism.¶ It's not certain if the CIA has shaped up in the years since that
report, although its shift toward online operations is going to make these kinds of digital
initiatives more important. Regardless of any improvements, it's clearer than ever that
the US government has sometimes had private doubts about the effectiveness of its
large-scale surveillance efforts.
Surveillance can’t solve terror
Mass surveillance can’t find terrorists – mathematically impossible
Rudmin, ’06 (FLOYD RUDMIN Professor of Social & Community Psychology at the
University of Tromsø in Norway “Why Does the NSA Engage in Mass Surveillance of Americans
When It’s Statistically Impossible for Such Spying to Detect Terrorists?” MAY 24, 2006
http://www.counterpunch.org/2006/05/24/why-does-the-nsa-engage-in-mass-surveillance-of-americans-when-it-s-statistically-impossible-for-such-spying-to-detect-terrorists/) //GY
The Bush administration and the National Security Agency (NSA) have been secretly monitoring
the email messages and phone calls of all Americans. They are doing this, they say, for our own
good. To find terrorists. Many people have criticized NSA’s domestic spying as
unlawful invasion of privacy, as search without search warrant, as abuse of
power, as misuse of the NSA’s resources, as unConstitutional, as something the
communists would do, something very unAmerican.¶ In addition, however, mass
surveillance of an entire population cannot find terrorists. It is a probabilistic
impossibility. It cannot work.¶ What is the probability that people are terrorists given that
NSA’s mass surveillance identifies them as terrorists? If the probability is zero (p=0.00), then
they certainly are not terrorists, and NSA was wasting resources and damaging the lives of
innocent citizens. If the probability is one (p=1.00), then they definitely are terrorists, and NSA
has saved the day. If the probability is fifty-fifty (p=0.50), that is the same as guessing the flip of a
coin. The conditional probability that people are terrorists given that the NSA
surveillance system says they are, that had better be very near to one (p≈1.00)
and very far from zero (p=0.00).¶ The mathematics of conditional probability were figured
out by the English logician Thomas Bayes. If you Google "Bayes’ Theorem", you will get more
than a million hits. Bayes’ Theorem is taught in all elementary statistics classes. Everyone at NSA
certainly knows Bayes’ Theorem.¶ To know if mass surveillance will work, Bayes’ theorem requires
three estimations:¶ 1) The base-rate for terrorists, i.e. what proportion of the population are
terrorists.¶ 2) The accuracy rate, i.e., the probability that real terrorists will be identified by NSA;¶
3) The misidentification rate, i.e., the probability that innocent citizens will be misidentified by
NSA as terrorists.¶ No matter how sophisticated and super-duper are NSA’s methods
for identifying terrorists, no matter how big and fast are NSA’s computers, NSA’s
accuracy rate will never be 100% and their misidentification rate will never be
0%. That fact, plus the extremely low base-rate for terrorists, means it is logically impossible
for mass surveillance to be an effective way to find terrorists.¶ I will not put Bayes’
computational formula here. It is available in all elementary statistics books and is on the web
should any readers be interested. But I will compute some conditional probabilities that people
are terrorists given that NSA’s system of mass surveillance identifies them to be terrorists. ¶ The
US Census shows that there are about 300 million people living in the USA. ¶ Suppose that there
are 1,000 terrorists there as well, which is probably a high estimate. The base-rate would be 1
terrorist per 300,000 people. In percentages, that is .00033% which is way less than 1%. Suppose
that NSA surveillance has an accuracy rate of .40, which means that 40% of real terrorists in the
USA will be identified by NSA’s monitoring of everyone’s email and phone calls. This is probably a
high estimate, considering that terrorists are doing their best to avoid detection. There is no
evidence thus far that NSA has been so successful at finding terrorists. And
suppose NSA’s misidentification rate is .0001, which means that .01% of innocent
people will be misidentified as terrorists, at least until they are investigated,
detained and interrogated. Note that .01% of the US population is 30,000 people. With these
suppositions, then the probability that people are terrorists given that NSA’s system
of surveillance identifies them as terrorists is only p=0.0132, which is near zero,
very far from one. Ergo, NSA’s surveillance system is useless for finding
terrorists.¶ Suppose that NSA’s system is more accurate than .40, let’s say, .70, which means
that 70% of terrorists in the USA will be found by mass monitoring of phone calls and email
messages. Then, by Bayes’ Theorem, the probability that a person is a terrorist if targeted by NSA
is still only p=0.0228, which is near zero, far from one, and useless.¶ Suppose that NSA’s system is
really, really, really good, really, really good, with an accuracy rate of .90, and a misidentification
rate of .00001, which means that only 3,000 innocent people are misidentified as terrorists. With
these suppositions, then the probability that people are terrorists given that NSA’s
system of surveillance identifies them as terrorists is only p=0.2308, which is far
from one and well below flipping a coin. NSA’s domestic monitoring of everyone’s
email and phone calls is useless for finding terrorists.¶ NSA knows this. Bayes’ Theorem
is elementary common knowledge. So, why does NSA spy on Americans knowing it’s not possible
to find terrorists that way? Mass surveillance of the entire population is logically
sensible only if there is a higher base-rate. Higher base-rates arise from two lines
of thought, neither of them very nice:¶ 1) McCarthy-type national paranoia;¶ 2)
political espionage.¶ The whole NSA domestic spying program will seem to work well, will
seem logical and possible, if you are paranoid. Instead of presuming there are 1,000 terrorists in
the USA, presume there are 1 million terrorists. Americans have gone paranoid before, for
example, during the McCarthyism era of the 1950s. Imagining a million terrorists in America puts
the base-rate at .00333, and now the probability that a person is a terrorist given that NSA’s
system identifies them is p=.99, which is near certainty. But only if you are paranoid. If NSA’s
surveillance requires a presumption of a million terrorists, and if in fact there are only 100 or only
10, then a lot of innocent people are going to be misidentified and confidently
mislabeled as terrorists.¶ The ratio of real terrorists to innocent people in the prison camps of
Guantanamo, Abu Ghraib, and Kandahar shows that the US is paranoid and is not bothered by
mistaken identifications of innocent people. The ratio of real terrorists to innocent people on
Bush’s no-fly lists shows that the Bush administration is not bothered by mistaken identifications
of innocent Americans.¶ Also, mass surveillance of the entire population is logically
plausible if NSA’s domestic spying is not looking for terrorists, but looking for
something else, something that is not so rare as terrorists. For example, the May 19
Fox News opinion poll of 900 registered voters found that 30% dislike the Bush administration so
much they want him impeached. If NSA were monitoring email and phone calls to identify pro-impeachment people, and if the accuracy rate were .90 and the error rate were .01,¶ then the
probability that people are pro-impeachment given that NSA surveillance system identified them
as such, would be p=.98, which is coming close to certainty (p≈1.00). Mass surveillance by NSA of
all Americans’ phone calls and emails would be very effective for domestic political intelligence. ¶
But finding a few terrorists by mass surveillance of the phone calls and email
messages of 300 million Americans is mathematically impossible, and NSA
certainly knows that.
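Rudmin's numbers can be verified directly. A minimal Python sketch of Bayes' theorem, plugging in the card's own assumptions (1,000 terrorists in a population of 300 million, plus its stated accuracy and misidentification rates):

```python
def p_terrorist_given_flag(base_rate, accuracy, false_positive_rate):
    """Bayes' theorem: P(terrorist | flagged by mass surveillance)."""
    true_hits = accuracy * base_rate               # real terrorists flagged
    false_hits = false_positive_rate * (1 - base_rate)  # innocents flagged
    return true_hits / (true_hits + false_hits)

base_rate = 1_000 / 300_000_000  # one terrorist per 300,000 people

# The card's three scenarios:
print(round(p_terrorist_given_flag(base_rate, 0.40, 0.0001), 4))   # 0.0132
print(round(p_terrorist_given_flag(base_rate, 0.70, 0.0001), 4))   # 0.0228
print(round(p_terrorist_given_flag(base_rate, 0.90, 0.00001), 4))  # 0.2308
```

Even the most generous scenario leaves a flagged person with less than a one-in-four chance of actually being a terrorist, which is the card's point about extremely low base rates swamping any plausible accuracy.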
Effectiveness of mass surveillance is massively overblown – hasn’t
aided a single case
Bergen et al, PhD, ’14 (Bailey Cahall David Sterman Emily Schneider Peter Bergen,
member of the Homeland Security Project, a successor to the 9/11 Commission a Professor of
Practice at Arizona State University and a fellow at Fordham University's Center on National
Security “DO NSA'S BULK SURVEILLANCE PROGRAMS STOP TERRORISTS?” JANUARY 13,
2014 https://www.newamerica.org/international-security/do-nsas-bulk-surveillance-programs-stop-terrorists/) //GY
However, our review of the government’s claims about the role that NSA “bulk”
surveillance of phone and email communications records has had in keeping the
United States safe from terrorism shows that these claims are overblown and
even misleading. An in-depth analysis of 225 individuals recruited by al-Qaeda or a like-minded group or inspired by al-Qaeda’s ideology, and charged in the United States with an act of
terrorism since 9/11, demonstrates that traditional investigative methods, such as the use
of informants, tips from local communities, and targeted intelligence operations, provided the
initial impetus for investigations in the majority of cases, while the contribution
of NSA’s bulk surveillance programs to these cases was minimal. Indeed, the
controversial bulk collection of American telephone metadata, which includes the telephone
numbers that originate and receive calls, as well as the time and date of those calls but not their
content, under Section 215 of the USA PATRIOT Act, appears to have played an
identifiable role in initiating, at most, 1.8 percent of these cases. NSA programs
involving the surveillance of non-U.S. persons outside of the United States under Section 702 of
the FISA Amendments Act played a role in 4.4 percent of the terrorism cases we examined, and
NSA surveillance under an unidentified authority played a role in 1.3 percent of the cases we
examined.¶ Regular FISA warrants not issued in connection with Section 215 or Section 702,
which are the traditional means for investigating foreign persons, were used in at least 48 (21
percent) of the cases we looked at, although it’s unclear whether these warrants played an
initiating role or were used at a later point in the investigation. (Click on the link to go to a
database of all 225 individuals, complete with additional details about them and the government’s
investigations of these cases: http://natsec.newamerica.net/nsa/analysis).¶ Surveillance of
American phone metadata has had no discernible impact on preventing acts of terrorism and only
the most marginal of impacts on preventing terrorist-related activity, such as fundraising for a
terrorist group. Furthermore, our examination of the role of the database of U.S. citizens’
telephone metadata in the single plot the government uses to justify the importance of the
program – that of Basaaly Moalin, a San Diego cabdriver who in 2007 and 2008 provided $8,500
to al-Shabaab, al-Qaeda’s affiliate in Somalia – calls into question the necessity of the Section 215
bulk collection program. According to the government, the database of American phone metadata
allows intelligence authorities to quickly circumvent the traditional burden of proof associated
with criminal warrants, thus allowing them to “connect the dots” faster and prevent future 9/11-scale attacks. Yet in the Moalin case, after using the NSA’s phone database to link a number
in Somalia to Moalin, the FBI waited two months to begin an investigation and
wiretap his phone. Although it’s unclear why there was a delay between the NSA tip and the
FBI wiretapping, court documents show there was a two-month period in which the
FBI was not monitoring Moalin’s calls, despite official statements that the bureau
had Moalin’s phone number and had identified him. This undercuts the government’s
theory that the database of Americans’ telephone metadata is necessary to expedite the
investigative process, since it clearly didn’t expedite the process in the single case the
government uses to extol its virtues.¶ Additionally, a careful review of three of the
key terrorism cases the government has cited to defend NSA bulk surveillance
programs reveals that government officials have exaggerated the role of the NSA
in the cases against David Coleman Headley and Najibullah Zazi, and the significance of the
threat posed by a notional plot to bomb the New York Stock Exchange. ¶ In 28 percent of the
cases we reviewed, court records and public reporting do not identify which
specific methods initiated the investigation. These cases, involving 62 individuals, may
have been initiated by an undercover informant, an undercover officer, a family member tip,
other traditional law enforcement methods, CIA- or FBI-generated intelligence, NSA surveillance
of some kind, or any number of other methods. In 23 of these 62 cases (37 percent), an informant
was used. However, we were unable to determine whether the informant initiated the
investigation or was used after the investigation was initiated as a result of the use of some other
investigative means. Some of these cases may also be too recent to have developed a public record
large enough to identify which investigative tools were used.¶ We have also identified three
additional plots that the government has not publicly claimed as NSA successes, but in which
court records and public reporting suggest the NSA had a role. However, it is not clear whether
any of those three cases involved bulk surveillance programs.¶ Finally, the overall problem for
U.S. counterterrorism officials is not that they need vaster amounts of
information from the bulk surveillance programs, but that they don’t sufficiently
understand or widely share the information they already possess that was derived
from conventional law enforcement and intelligence techniques. This was true for
two of the 9/11 hijackers who were known to be in the United States before the attacks on New
York and Washington, as well as with the case of Chicago resident David Coleman Headley, who
helped plan the 2008 terrorist attacks in Mumbai, and it is the unfortunate pattern we have also
seen in several other significant terrorism cases.
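The report's percentages are shares of its 225-case sample, which can be checked with one line of arithmetic each. A quick Python sketch; the 48 regular-FISA cases and the 62-case unidentified-method group are stated explicitly in the card, while the "1.8 percent" Section 215 figure implying roughly four cases is an inference:

```python
CASES = 225  # individuals in the New America sample

def share(n, total=CASES):
    """Percentage of the sample, rounded to one decimal place."""
    return round(100 * n / total, 1)

print(share(48))             # 21.3 -> the "at least 48 (21 percent)" FISA cases
print(share(62))             # 27.6 -> the ~28 percent with unidentified methods
print(round(0.018 * CASES))  # 4    -> "1.8 percent" implies about four 215 cases
```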
Surveillance can’t solve – their authors inflate data
Nicks, ’14 (Denver Nicks, TIME, “Report: Usefulness of NSA Mass Surveillance ‘Overblown’”
Jan. 13, 2014 http://swampland.time.com/2014/01/13/report-usefulness-of-nsa-mass-surveillance-overblown/) //GY
Ever since Edward Snowden’s leaks began revealing the extent of the National Security Agency’s
mass surveillance—or “bulk collection”—programs last June, officials have defended the programs
with one number: 50.¶ “We know of at least 50 threats that have been averted because of this
information not just in the United States, but, in some cases, threats here in Germany. So lives
have been saved,” President Obama said on a visit to Berlin. NSA Director Gen. Keith Alexander
made the same claim testifying before Congress.¶ But a new study out Monday from The
New America Foundation says that claim is simply false, calling it “overblown,
and even misleading.”¶ “Surveillance of American phone metadata has had no
discernible impact on preventing acts of terrorism and only the most marginal of
impacts on preventing terrorist-related activity, such as fundraising for a terrorist
group,” says the nonpartisan think tank’s report, titled “Do NSA’s Bulk Surveillance Programs
Stop Terrorists?”¶ In an analysis of 225 al-Qaeda-linked individuals charged with
terrorism in the U.S. since 9/11, the report found NSA mass surveillance of
Americans’ telephone records—authorized under Section 215 of the USA
PATRIOT Act—“played an identifiable role in initiating, at most, 1.8 percent” of
investigations.¶ The report acknowledges that in 28 percent of cases it reviewed, researchers
couldn’t determine what methods initiated the investigation. But in many of those cases an
informant played a role in the investigation, says the report.¶ ACLU Legislative Counsel Michelle
Richardson told TIME the report “confirms that the numbers and examples the
government has floated in support of its domestic spying programs are grossly
inflated. More broadly though, it underlines how far the government has actually gotten away
from the original lessons of 9/11. Instead of working on connecting the dots collected from
traditional investigations, it has become obsessed with collecting ever more data
whether it is useful or not.”
Surveillance can’t solve – their authors are lying
Terbush, ’13 (Jon Terbush, correspondent The Week, “Is the NSA's data snooping actually
effective?” December 19, 2013 http://theweek.com/articles/453981/nsas-data-snooping-actually-effective) //GY
The White House on Wednesday released a much-anticipated independent review of the National
Security Agency's spy programs, which offered 46 recommendations for reforming the agency's
spy ops.¶ The report concluded that the programs, though they had gone too far, should stay in
place. But it nevertheless may have undermined the NSA's claim that the collection of all phone
metadata is a necessary tool to combat terrorism.¶ "Our review suggests that the
information contributed to terrorist investigations by the use of section 215
telephony metadata was not essential to preventing attacks and could readily
have been obtained in a timely manner using conventional section 215 orders," the
report said.¶ That finding came just days after a federal judge ruled that the phone data collection
program was "likely unconstitutional." Moreover, he wrote in his decision that, for all the
government's bluster, there was no indication the program had actually produced
tangible results.¶ "The government does not cite a single case in which analysis of
the NSA's bulk metadata collection actually stopped an imminent terrorist
attack," Judge Richard Leon wrote.¶ Given the limited record before me at this point in the
litigation — most notably, the utter lack of evidence that a terrorist attack has ever been prevented
because searching the NSA database was faster than other investigative tactics — I have serious
doubts about the efficacy of the metadata collection program as a means of
conducting time-sensitive investigations in cases involving imminent threats of
terrorism. [PDF]¶ Granted, the collection of phone data is just one of the NSA's many once-secret tools. And unsurprisingly, the White House, hawkish lawmakers, and those who oversee
the spy programs have repeatedly claimed that the NSA's programs in their entirety have proven
crucial to snuffing out terror plots.¶ Shortly after whistleblower Edward Snowden's leaks turned
up in the press, NSA Director Gen. Keith Alexander defended his agency's surveillance practices
before the Senate. The surveillance programs, he said, had stopped dozens of attacks at home and
abroad, including a 2009 plot to bomb the New York City subway system.¶ Obama, too, said back
in June that the programs had thwarted "at least 50" possible attacks. (He and others often cite 54
as an exact number.) He also defended the tradeoff of civil liberties for security as a necessary one
— "we have to make choices as a society" — adding that the programs were merely "modest
encroachments on privacy."¶ However, two prominent critics of the NSA, Democratic Sens. Ron
Wyden (Ore.) and Mark Udall (Colo.), challenged that assessment in a joint statement following
Alexander's testimony.¶ "We have not yet seen any evidence showing that the NSA's
dragnet collection of Americans' phone records has produced any uniquely
valuable intelligence," they wrote.¶ Alexander had only specified a couple of the supposed
"dozens" of instances in which NSA spying thwarted terror plots. The two senators added in a
subsequent statement that it appeared the government had actually uncovered those
plots via other investigative tools, and that the NSA's data snooping had "played
little or no role in most of these disruptions."¶ A ProPublica investigation earlier this year
likewise determined that there was "no evidence that the oft-cited figure [of 54
disrupted plots] is accurate," and that the NSA was often "inconsistent on how
many plots it has helped prevent and what role the surveillance programs
played."¶ It's pretty much impossible to say who's right with any certainty. A full explanation of
the supposedly disrupted plots remains classified, so only some lawmakers and those involved in
the programs know exactly how effective they've been.¶ Still, it's likely that the NSA has at
least overstated the effectiveness of its tools, particularly in comparison to the
sheer scale of the spying. Sen. Patrick Leahy (D-Vt.), after reviewing the full classified list of
thwarted plots, concluded the programs had "value," but that the 54 figure was a gross
exaggeration.¶ "That's plainly wrong," he said at a July hearing. "These weren't all
plots and they weren't all thwarted."
Surveillance doesn’t work – can’t combat terror
Tufnell, Wired, ’14 (NICHOLAS TUFNELL, Wired, “NSA bulk surveillance has 'no
discernible impact' on the prevention of terrorism” 14 JANUARY 14
http://www.wired.co.uk/news/archive/2014-01/14/naf-report-on-nsa) //GY
The New America Foundation (NAF) has released a damning report claiming the NSA's mass
surveillance programme has "no discernible impact" on the prevention of
terrorism.¶ The report, "Do NSA's Bulk Surveillance Programs Stop Terrorists?", also claims that
the NSA is guilty of repeatedly exaggerating the efficacy of its bulk surveillance
techniques in addition to misleading the public over aspects of 9/11, and of failing
to prevent crime efficiently due to an insufficient understanding of its own
intelligence already sourced by traditional means.¶ The NAF describes itself as a nonprofit, nonpartisan public policy institute and think tank focussing on a wide range of issues,
including national security studies. Investigating claims made by the US government concerning
the competence and effectiveness of the NSA's bulk surveillance since 9/11, the NAF report
compiled a database of 225 people from the US, including US nationals abroad, who have been
indicted, convicted, or killed since the 9/11 terror attacks.¶ Key methods used to initiate
investigations on these individuals were identified by the report and divided into eight separate
categories:¶ "Those cases in which the initiating or key role was played by the bulk collection of
American telephone metadata under Section 215; NSA surveillance of non-US persons overseas
under Section 702; NSA surveillance under an unknown authority; tips from the extremist's
family or local community members; tips regarding suspicious activity from individuals who were
not part of an extremist's family or local community; the use of an undercover informant; the
routine conduct of law enforcement or intelligence operations in which the NSA did not play a key
role; and self-disclosure of extremist activity on the part of the extremist in question."¶ The
report also acknowledges that the public records from which it drew the
information may be incomplete and that there is reason to believe the
government has actively concealed the role of NSA programmes in some
investigations: "Drug Enforcement Administration (DEA) agents have been trained in some
instances, for example, to conceal the role of a DEA unit that analysed metadata to initiate cases." ¶
The bulk collection of US citizens' telephone metadata -- which includes phone
numbers, both incoming and outgoing, as well as the exact time, date and duration of the calls
(but not the content) under Section 215 of the US Patriot Act -- accounts for having aided
only 1.8 percent of the NSA's terrorist cases.¶ An equally unimpressive 4.4 percent of
terrorism cases were aided by the NSA's surveillance of non-US persons outside of the United
States under Section 702 of the FISA Amendments Act.¶ Commenting on these figures the
report states, "Surveillance of American phone metadata has had no discernible
impact on preventing acts of terrorism and only the most marginal of impacts on
preventing terrorist related activity, such as fundraising for a terrorist group."¶ Of
the terrorist plot regularly cited by the US government as evidence of the necessity and success of
its surveillance techniques -- namely, Basaaly Moalin, a San Diego taxi driver who provided
$8,500 (£5,171) to an al-Qaeda affiliate in Somalia -- the NAF report states that the NSA's actions
contradict its claims that the expediency afforded by Section 215 was largely responsible for the
success of Moalin's capture.¶ "According to the government, the database of American phone
metadata allows intelligence authorities to quickly circumvent the traditional burden of proof
associated with criminal warrants, thus allowing them to 'connect the dots' faster and prevent
future 9/11-scale attacks. Yet in the Moalin case, after using the NSA's phone database to link a
number in Somalia to Moalin, the FBI waited two months to begin an investigation and wiretap
his phone."¶ The reasons behind the two-month delay -- during which time the FBI was not
monitoring Moalin's calls, despite being aware of his number and identity -- are still unclear.
What is clear, however, is that the bulk surveillance programme did not expedite the
investigative process, despite the US government's claims to the contrary. ¶ The
report also reviewed three key terrorism cases frequently cited by the US
government in defence of the NSA's bulk surveillance. It concluded that government
officials exaggerated the role of the NSA in the cases against David Coleman Headley and
Najibullah Zazi. The significance of the threat of Zazi, who planned to bomb the New
York Stock Exchange, was also exaggerated, claims the report.¶ More emphasis, the
report suggests, should be placed on conventional forms of law enforcement,
which are demonstrably more efficient. "The overall problem for US counterterrorism
officials is not that they need vaster amounts of information from the bulk surveillance programs,
but that they don't sufficiently understand or widely share the information they already possess
that was derived from conventional law enforcement and intelligence techniques."
Can’t solve – studies and Snowden
Osterndorf, 3/17 (Chris Osterndorf, writer Daily Dot, “Edward Snowden is right—NSA
surveillance won't stop terrorism,” Mar 17, 2015, http://www.dailydot.com/opinion/edward-snowden-mass-surveillance-nsa-america/) //GY
It appears that Snowden season is approaching once again.¶ The controversial whistleblower
made a surprise appearance via Google Hangout at SXSW this week, where his remarks proved
captivating as always. Essentially a less flashy sequel to his ACLU speech from 2014, Snowden
only spoke to a few people this time around, engaging in a conversation with a select group of
leaders from America’s tech sector. In particular, he urged tech companies to become "champions
of privacy," suggesting that they use their power to help shield Americans from an increasingly
watchful government.¶ In addition to speaking at SXSW in Austin, Snowden also said a few words
at FutureFest in London, where he warned that massive surveillance won't stop
terrorism. In this instance, Snowden is absolutely correct, and it’s time we start heeding
his advice.¶ At this point, the only people still clinging to the idea that this is effective are the NSA themselves.
In 2013, NSA Director Gen. Keith Alexander went before the House Intelligence Committee to
testify to claim that increased surveillance had helped to stop terrorist threats over 50 times since
9/11, including attacks on U.S. soil such as a plot to blow up the New York Stock Exchange and a
defunct scheme to fund an overseas terrorist group. ¶ Other witnesses in the same hearing also
suggested that the Snowden leaks had harmed America greatly. “We are now faced with a
situation that because this information has been made public, we run the risk of losing these
collection capabilities,” stated Robert S. Litt, general counsel of the Office of the Director of
National Intelligence. “We’re not going to know for many months whether these leaks in fact have
caused us to lose these capabilities, but if they do have that effect, there is no doubt that they will
cause our national security to be affected.”¶ However, the details the NSA provided in this hearing
were somewhat hazy, and a closer look at the numbers indicates the benefits of
increased surveillance may not be so clear-cut after all. Research from International
Security found that out of the 269 terrorist suspects apprehended since 9/11, 158 were brought in
through the use of traditional investigative measures. That’s almost 60 percent of all who were
arrested. Meanwhile, 78 suspects were apprehended through measures which were “unclear” and
15 were implicated in plots but were not apprehended, while the remaining 18 were apprehended
by some form of NSA surveillance.¶ Eighteen is no small number when you’re discussing matters
of national security; however, the above statistics do not necessarily indicate that mass
surveillance was responsible for the apprehension of these 18 terrorists or
whether these suspects were detained under more traditional surveillance
measures. Moreover, the evidence suggests that traditional means of combatting
terrorism are more effective than surveillance when it comes to overall arrests.¶
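The suspect-apprehension figures above can be checked with simple arithmetic. A minimal Python sketch (the 269/158/78/15/18 counts come from the article; the category labels are paraphrased):

```python
# Breakdown of the 269 post-9/11 terrorism suspects cited from the
# International Security research, by how they were identified.
total_suspects = 269
breakdown = {
    "traditional investigative measures": 158,
    "unclear measures": 78,
    "implicated but not apprehended": 15,
    "some form of NSA surveillance": 18,
}

# The four categories should account for every suspect.
assert sum(breakdown.values()) == total_suspects

for category, count in breakdown.items():
    share = 100 * count / total_suspects
    print(f"{category}: {count} ({share:.1f}%)")
```

Running this confirms the article's framing: 158 of 269 is about 58.7 percent ("almost 60 percent"), while NSA surveillance accounts for roughly 6.7 percent of the suspects.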
Additional analysis from the New America Foundation further supports these findings.
Examining 225 post-9/11 terrorism cases in the U.S., their 2014 report found that the NSA’s
bulk surveillance program “has had no discernible impact on preventing acts of
terrorism,” citing traditional methods of law enforcement and investigation as
being far more effective in the majority of cases. In as many as 48 of these cases,
traditional surveillance warrants were used to collect evidence, while more than half of the cases
were the product of other traditional investigative actions, such as informants and reports of
suspicious activity. ¶ In fact, New America determined that the NSA has only been responsible for
7.5 percent of all counterterrorism investigations and that only one of those investigations
led to suspects being convicted based on metadata collection. And that case, which
took months to solve, as the NSA went back and forth with the FBI, involved money being sent to
a terrorist group in Somalia, rather than an active plan to perpetrate an attack on U.S. soil. ¶
According to the report’s principal author Peter Bergen, who is the director of the foundation’s
National Security Program and their resident terrorism expert, the issue has less to do with the
collection of data and more to do with the comprehension of it. Bergen said, “ The overall
problem for U.S. counterterrorism officials is not that they need vaster amounts
of information from the bulk surveillance programs, but that they don’t
sufficiently understand or widely share the information they already possess that
was derived from conventional law enforcement and intelligence techniques.”¶ Of
course, even when all of the data has been collected, it still isn’t enough to stop a
terrorist attack. “It’s worth remembering that the mass surveillance programs initiated by the
U.S. government after the 9/11 attacks—the legal ones and the constitutionally dubious ones—
were premised on the belief that bin Laden’s hijacker-terrorists were able to pull off the attacks
because of a failure to collect enough data,” asserts Reason’s Patrick Eddington. “Yet in their
subsequent reports on the attacks, the Congressional Joint Inquiry (2002) and the 9/11
Commission found exactly the opposite. The data to detect (and thus foil) the plots was
in the U.S. government’s hands prior to the attacks; the failures were ones of
sharing, analysis, and dissemination.” So once again, we see that the key is not
collection, but comprehension.¶ If all of this still doesn’t seem like enough
evidence that mass surveillance is ineffective, consider that a White House review
group has also admitted the NSA’s counterterrorism program “was not essential
to preventing attacks” and that a large portion of the evidence that was collected
“could readily have been obtained in a timely manner using conventional [court]
orders.”¶ But mass surveillance isn’t just the United States’ problem. Research has shown that
Canada's Levitation project, which also involves collecting large amounts of data in the service of
fighting terrorism, may be just as questionable as the NSA’s own data collection practices.
Meanwhile, in response to the Charlie Hebdo attacks in Paris, British Prime Minister David
Cameron has reintroduced the Communications Data Bill, which would force telecom companies
to keep track of all Internet, email, and cellphone activity and ban encrypted communication
services. ¶ But support for this type of legislation in Europe doesn't appear to be any stronger than
in North America. Slate’s Ray Corrigan argued, “Even if your magic terrorist-catching machine
has a false positive rate of 1 in 1,000—and no security technology comes anywhere near this—
every time you asked it for suspects in the U.K., it would flag 60,000 innocent people.” ¶
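Corrigan's figure is straightforward base-rate arithmetic; a minimal sketch, assuming the roughly 60 million UK population his number implies:

```python
# The arithmetic behind Corrigan's "60,000 innocent people": even a
# hypothetical detector with a 1-in-1,000 false-positive rate flags an
# enormous number of innocents when run against a whole population.
uk_population = 60_000_000       # approximate figure implied by the article
false_positive_rate = 1 / 1000   # Corrigan's (generously low) assumed rate

flagged_innocents = int(uk_population * false_positive_rate)
print(flagged_innocents)  # 60000
```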
Fortunately, the cultural shift against increased data collection has become so evident in the U.S.
that even President Obama is trying to get out of the business of mass surveillance; the president
announced plans last March to reform the National Security Agency's practice of collecting call
records, which have yet to come to fruition.¶ Benjamin Franklin famously said that “those who
would give up essential liberty to purchase a little temporary safety deserve neither liberty nor
safety.” While this quote has been notoriously butchered and misinterpreted over the years, it has
now become evident that we shouldn’t have to give up either of these things in pursuit of the
other. The U.S. is still grappling with how to fight terrorism in this technologically advanced age,
but just because we have additional technology at our disposal, doesn’t mean that technology is
always going to be used for the common good. You may believe Edward Snowden to be a traitor or
a hero, but on this matter, there is virtually no question: Mass surveillance is not only
unconstitutional, it is also the wrong way to fight terrorism.
NSA good - cyberterror
The NSA is key to check cyber threats
Goldsmith, 13
Jack Goldsmith, Henry L. Shattuck Professor at Harvard Law School, “We Need an Invasive
NSA,” New Republic, 10/10/13, http://www.newrepublic.com/article/115002/invasive-nsa-will-protect-us-cyber-attacks // IS
Such cyber-intrusions threaten corporate America and the U.S. government every
day. “Relentless assaults on America’s computer networks by China and other
foreign governments, hackers and criminals have created an urgent need for
safeguards to protect these vital systems,” the Times editorial page noted last year while
supporting legislation encouraging the private sector to share cybersecurity information with the
government. It cited General Keith Alexander, the director of the NSA, who had
noted a 17-fold increase in cyber-intrusions on critical infrastructure from 2009
to 2011 and who described the losses in the United States from cyber-theft as “the
greatest transfer of wealth in history.” If a “catastrophic cyber-attack occurs,” the
Times concluded, “Americans will be justified in asking why their lawmakers ... failed to protect
them.”
When catastrophe strikes, the public will adjust its tolerance for intrusive government measures.
The Times editorial board is quite right about the seriousness of the cyber- threat and the federal
government’s responsibility to redress it. What it does not appear to realize is the connection
between the domestic NSA surveillance it detests and the governmental assistance with
cybersecurity it cherishes. To keep our computer and telecommunication networks
secure, the government will eventually need to monitor and collect intelligence
on those networks using techniques similar to ones the Times and many others
find reprehensible when done for counterterrorism ends.
The fate of domestic surveillance is today being fought around the topic of
whether it is needed to stop Al Qaeda from blowing things up. But the fight
tomorrow, and the more important fight, will be about whether it is necessary to
protect our ways of life embedded in computer networks.
Anyone anywhere with a connection to the Internet can engage in cyber-operations within the United States. Most truly harmful cyber-operations,
however, require group effort and significant skill. The attacking group or nation
must have clever hackers, significant computing power, and the sophisticated
software—known as “malware”—that enables the monitoring, exfiltration, or destruction of
information inside a computer. The supply of all of these resources has been growing
fast for many years—in governmental labs devoted to developing these tools and on sprawling
black markets on the Internet.
The NSA is key to check cyber threats – it outweighs other agencies
Finley, 10
Klint Finley, Writer for ReadWriteWeb. He has a BA from The Evergreen State College, where he
studied communications and mass media. He's the editor of the blogs Technoccult and
Mediapunk, and has been writing about technology and culture for over 10 years, “Do Private
Enterprises Need the NSA to Protect Them From Cyber Attacks?” ReadWrite, 7/8/10,
http://readwrite.com/2010/07/08/nsa-perfect-citizen // IS
The Wall Street Journal reports, citing unnamed sources, that the NSA is
launching a program to help protect critical infrastructure - including private
enterprises - from cyber attacks. According to the paper, defense contractor
Raytheon has received the contract for the project, which would rely on a series of
sensors to detect "unusual activity suggesting an impending cyber attack." This
follows the Lieberman-Collins bill passing committee in the Senate.
The Orwellian nature of the name was allegedly not lost on Raytheon: The Wall Street Journal
claims to have seen an internal Raytheon e-mail saying "Perfect Citizen is Big Brother."
Although the project will be mostly aimed at public infrastructure with older
computer control systems, such as subways and air-traffic control systems, the
paper notes that the program could be of use to private companies like Google
and the many other companies who sustained cyber attacks late last year.
A power struggle has been brewing over who should be in charge of national cybersecurity: the
NSA, the Department of Homeland Security, the military, or the private sector. The Lieberman-Collins bill would put the responsibility squarely on the shoulders of the White House and DHS,
establishing new cybersecurity organizations in both and requiring private enterprises to take
orders from the federal government in the event of a "national cyber emergency."
However, security expert Bruce Schneier notes that DHS has been
"getting hammered" in
Senate Homeland Security Committee hearings and that the NSA has been
consolidating its power. "Perfect Citizen" would appear to be a major power grab by the
agency.
Other countries are threats now – NSA is key to assessment
Tadjdeh, 15
Yasmin Tadjdeh, former editor-in-chief of VoxPop, Staff Writer at National Defense Magazine,
graduated with a B.A. in history from George Mason University, “NSA Chief: China, Russia
Capable of Carrying Out ‘Cyber Pearl Harbor’ Attack,” National Defense Magazine, 2/23/15,
http://www.nationaldefensemagazine.org/blog/Lists/Posts/Post.aspx?List=7c996cd7-cbb4-4018-baf8-8825eada7aa2&ID=1757 // IS
Nations such as China and Russia have enough offensive cyber capabilities to one
day carry out a “cyber Pearl Harbor” attack, said the head of the National
Security Agency and U.S. Cyber Command.
“We’ve talked about our concerns with China and what they’re doing in cyber.
Clearly the Russians and others have [those types of] capabilities,” said Navy Adm.
Mike Rogers on Feb. 23. “We’re mindful of that.”
A cyber Pearl Harbor could include an attack on critical infrastructure or the
financial sector, Rogers said during a cyber security forum sponsored by the New America
Foundation, a Washington, D.C.-based think tank.
“You’ve seen some [smaller events already]. You look at what happened at Sony,
you look at what we’ve seen nation states attempting to do against U.S. financial
websites for some years now,” Rogers said. There would be dire implications for the nation
if ordinary citizens were unable to access their bank accounts, he added.
In the Defense Department, there is great concern about intellectual property being stolen, he
noted.
“Certainly in the Department of Defense, it’s an issue that has been of great
concern to us for some time," he said. Nation states have penetrated some key defense
contractors, and stolen the enabling technology that gives the U.S. military an
operational advantage, he said.
Part of the NSA’s function is to keep tabs on potential threats to the United
States. In 2013, the agency came under fire after Edward Snowden, a government contractor at
the time, leaked classified information that revealed the agency was collecting enormous amounts
of phone metadata from U.S. citizens.
Rogers defended the program and said the bulk collection of data absolutely helps the
nation prevent attacks.
Cyber risks are high now – Iran – no treaties can check
Gady, 15
Franz-Stefan Gady is an Associate Editor with The Diplomat and a Senior Fellow with the
EastWest Institute, “Iran and the United States Locked in Cyber Combat,” The Diplomat, 2/27/15,
http://thediplomat.com/2015/02/iran-and-the-united-states-locked-in-cyber-combat/ // IS
This month the news website The Intercept revealed a new National Security Agency
document outlining the ongoing battle between Iran and the United States in
cyberspace. The memo, dated from April 2013, was prepared for then N.S.A. director and head
of U.S. Cyber Command General Keith B. Alexander and contains a number of talking points for
the general’s interaction with the head of Britain’s Government Communications Headquarters
(GCHQ) — the British equivalent to the American N.S.A.
Most importantly, the document outlines a cycle of
escalating cyberattacks and
counter-attacks, first initiated by the Israeli-American Stuxnet attack against Iranian
computers:
“Iran continues to conduct distributed denial-of-service (DDOS) attacks against
numerous U.S. financial institutions, and is currently in the third phase of a
series of such attacks that began in August 2012. SIGINT [signals intelligence] indicates that
these attacks are in retaliation to Western activities against Iran’s nuclear sector and that senior
officials in the Iranian government are aware of these attacks.”
The memo also outlines what can only be described as a cyber-arms race
between the two nations: “NSA expects Iran will continue this series of attacks,
which it views as successful, while striving for increased effectiveness by adapting
its tactics and techniques to circumvent victim mitigation attempts.”
Iranian hackers, the memo notes, have adapted quickly and learned from their
adversary as illustrated by one spectacular attack:
“Iran’s destructive cyberattack against Saudi Aramco in August 2012, during which data
was destroyed on tens of thousands of computers, was the first such attack NSA
has observed from this adversary. Iran, having been a victim of a similar cyberattack
against its own oil industry in April 2012, has demonstrated a clear ability to learn from
the capabilities and actions of others.”
The N.S.A. document further emphasizes that the cyber conflict between the two
countries is far from over and that the capabilities of both sides are ever
expanding: “We continually update contingency plans to reflect the changes in
both our access and Iran’s capabilities.”
After the publication of this document, The New York Times Editorial Board called for cyber arms
control treaties. “The best way forward is to accelerate international efforts to negotiate limits on
the cyberarms race, akin to the arms-control treaties of the Cold War. Barring that, there are few
viable ways to bring these new weapons and their use under control,” the board cautions.
Yet given the easy proliferation of cyber weapons — although sophisticated
cyberattacks such as Stuxnet require the backing of a nation state — a cyber-arms
control treaty may be illusionary at this stage. Perhaps another idea may be worth
considering.
Catastrophic cyberattacks are coming now
Erwin et al., 12
Sandra Erwin, Stew Magnuson, Dan Parsons and Yasmin Tadjdeh, *Editor of National Defense
Magazine, graduated from Coe College with a B.A. in Political Science, “Top Five Threats to
National Security in the Coming Decade: The Future of Cyber Wars,” National Defense Magazine,
November 2012,
http://www.nationaldefensemagazine.org/archive/2012/november/pages/topfivethreatstonation
alsecurityinthecomingdecade.aspx // IS
Army Gen. Keith Alexander, commander of U.S. Cyber Command, sees a day in
the
not too distant future when attacks on computer networks cross the line from
theft and disruption to “destruction.”
And this chaos will not all take place in the digital world of ones and zeroes. He is
referring to remote adversaries taking down infrastructure such as power
grids, dams, transportation systems and other sectors that use computer-based industrial controls.
The last decade has seen mostly exploitation by adversaries, or the theft of money
and intellectual property. Next came distributed denial of service attacks when
hackers overwhelm networks and disrupt operations of businesses or other
organizations, Alexander said at a recent Woodrow Wilson Center panel discussion on
cybersecurity.
Other than intercontinental ballistic missiles and acts of terrorism, an adversary
seeking to reach out and harm the United States has only one other option:
destructive cyber-attacks, Alexander said.
This could result in loss of life and damage to the economy on par with what
occurred after 9/11.
“All of that is within the realm of the possible,” he said. “I believe that is
coming our way . We have to be out in front of this,” Alexander said.
How to thwart such attacks is the problem the nation is facing.
Most of the Internet’s infrastructure through which malware is delivered is in the private sector’s
hands. So too are the banking, energy, transportation and other institutions that are vulnerable to
the attacks.
During the past year, there have been 200 attacks on core critical infrastructures in
the transportation, energy, and communication industries reported to the
Department of Homeland Security, said Sen. Susan Collins, R-Maine, and ranking member
of the Senate Homeland Security and Governmental Affairs Committee.
“And that is only the tip of the iceberg. Undoubtedly there are more that have not been reported,”
she said during the panel.
“In this case, the dots have already been
connected. The alarm has already been
sounded, and we know it is only a matter of when, not whether we have a
catastrophic attack,” she said.
Alexander, Collins and others are advocating for a more coordinated national effort to share
information on cyberthreats. Cyber Command and the National Security Agency have loads of
expertise, but can’t always share classified information, or cross lines when it comes to the privacy
of U.S. citizens. The rest of the federal government has bits and pieces of information, and
different responsibilities. Private sector companies are sometimes reluctant to disclose attacks for
fear of upsetting shareholders or opening themselves up to lawsuits. Legislation co-sponsored by
Collins to help pave the way for better information sharing died in Congress last summer.
Despite having poured countless amounts of money into cybersecurity on both the federal and
private levels, there is still a lot to be learned about the threat, said one analyst.
“Why do we have a cyberwar community? Because we haven’t mastered cyber,” said Martin
Libicki, senior management scientist at Rand Corp. The main problem is that computers were
originally seen as something to be “tinkered” with, Libicki said.
“We’ve built the most important infrastructure on things that were made to be toys,” said Libicki.
Being toys, computer systems from the start have gaps and vulnerabilities needing to be patched.
“Every cyber-attack is a reflection of some vulnerability in the system,” said Libicki.
Greg Giaqunito, an analyst at Forecast International, said it’s not enough to defend the nation
from attacks, offensive capabilities are required.
Increasingly, the United States is taking more proactive measures against adversaries and
initiating activity, Giaqunito said.
“We are actually taking proactive action against other adversaries, so it’s not only protecting
ourselves, but the U.S. taking more proactive stances and actually initiating some activity against
U.S. adversaries,” he said.
Over the next few years, the hackers will become more sophisticated, said Charles
Croom, vice president of cyber security solutions at Lockheed Martin Information
Systems & Global Solutions. This doesn’t necessarily mean that the technologies
are becoming more advanced — even the most sophisticated threats often use
known vulnerabilities and malware, Croom said — but the adversaries have
become more effective.
AT: SIGINT CP
perm
Perm do both – reducing data collection is key to free up resources for
solutions
Tufekci, 15
Zeynep Tufekci, an assistant professor at the University of North Carolina, “Terror and the limits
of mass surveillance,” Financial Times: The Exchange, 2/3/15, http://blogs.ft.com/the-exchange/2015/02/03/zeynep-tufekci-terror-and-the-limits-of-mass-surveillance/ // IS
Europe’s top human rights body, the Council of Europe, put out a report last week
blasting governmental mass surveillance, joining a long list of organisations and
individuals who have voiced strong moral and political objections to National Security Agency-type blanket surveillance. This latest report, like most such criticisms, misses a key
point: despite the common notion that we are trading safety for liberty by letting
governments scoop up so much of our data, the truth is that mass surveillance
and big data analytics are not the right tool for extremely rare events like acts
of terrorism in western societies.
The most common justification given by governments for mass surveillance is
that these tools are indispensable for fighting terrorism. The NSA’s ex-director Keith
Alexander says big data is “what it’s all about”. Intelligence agencies routinely claim that they
need massive amounts of data on all of us to catch the bad guys, like the French brothers who
assassinated the cartoonists of Charlie Hebdo, or the murderers of Lee Rigby, the British soldier
killed by two men who claimed the act was revenge for the UK’s involvement in the wars in Iraq
and Afghanistan.
But the assertion that big data is “what it’s all about” when it comes to predicting
rare events is not supported by what we know about how these methods work,
and more importantly, don’t work. Analytics on massive datasets can be
powerful in analysing and identifying broad patterns, or events that occur
regularly and frequently, but are singularly unsuited to finding unpredictable,
erratic, and rare needles in huge haystacks. In fact, the bigger the haystack — the
more massive the scale and the wider the scope of the surveillance — the less
suited these methods are to finding such exceptional events, and the more they may
serve to direct resources and attention away from appropriate tools and methods.
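Tufekci's haystack point is the classic base-rate problem from Bayes' theorem. A minimal sketch with illustrative numbers of my own (none of these rates appear in the article):

```python
# Why rare events defeat dragnet analytics: even an unrealistically
# accurate classifier yields almost entirely false positives when the
# target behavior is extremely rare. All rates here are assumptions.
base_rate = 1e-6        # fraction of the population actually plotting attacks
sensitivity = 0.99      # chance the system flags a true plotter
false_positive = 0.01   # chance the system flags an innocent person

# P(plotter | flagged), via Bayes' theorem
p_flagged = sensitivity * base_rate + false_positive * (1 - base_rate)
p_plotter_given_flag = (sensitivity * base_rate) / p_flagged

print(f"{p_plotter_given_flag:.4%}")  # roughly 0.01% of flags are real
```

Widening the dragnet to more people lowers the effective base rate further, which is exactly Tufekci's point: the bigger the haystack, the worse these methods perform.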
squo solves
Current programs check analysts
SIDtoday, 6
Signals Intelligence Directorate Today, “Dealing with a ‘Tsunami’ of Intercept,” 8/29/06,
https://s3.amazonaws.com/s3.documentcloud.org/documents/2088984/tsunami-of-intercept.pdf // IS
(S//SI) Everyone knows that analysts have been drowning in a tsunami of intercept
whose volume, velocity and variety can be overwhelming. But the Human Language
Technology Program Management Office (HLT PMO) can predict that in the very near future the
speed and volume of SIGINT will increase even more, almost beyond imagination. And we are
working on ways to help analysts deal with it all.
(S//SI) Of the HLT PMO's five Strategic Thrusts, the one that addresses this
problem is High Speed/ High Volume. It must deal with today's collection and must plan
for tomorrow's. The current collection environment is characterized by huge amounts of data,
coupled with severely limited capability to send material forward, and extremely limited number
of queries that exactly describe messages of value. That means we are capable of finding huge
amounts of data, much of which is not what we really want, and that we cannot send it all back for
analyst processing.
(TS//SI) To plan for tomorrow, High Speed/
High Volume is in line with changes in
the overall NSA/CSS systems, particularly TURBULENCE and TURMOIL
because when they become a reality in the near future, we can expect collection
capabilities to increase significantly. TURBULENCE is an umbrella cover term
describing the next generation mission environment that will create a unified
system. TURMOIL is a passive filtering and collection effort on high-speed
networks. This is designed to be flexible and can be modified quickly to deliver
data in analyst- ready form.
(S//SI) One of High Speed/ High Volume's first efforts is in developing and
implementing ways to push HLT capabilities very close to the collection points of
the SIGINT system. In particular, HLT is about to demonstrate an operational
prototype of language identification for Special Source Operations (SSO)
Counterterrorism text targets running at line speeds (STM-16) at the packet-level.
Resources permitting, HLT analytic processors will automatically generate content-based events
for TURMOIL based on language.
CP links to PTX
NSA action links to politics
Poitras and Risen, 13
Laura Poitras and James Risen, *Academy Award-winning American documentary film director
and producer who produced Citizenfour, a documentary on Snowden, **reporter for the New
York Times, “N.S.A. Report Outlined Goals for More Power,” The New York Times, 11/23/13,
http://www.nytimes.com/2013/11/23/us/politics/nsa-report-outlined-goals-for-more-power.html // IS
Prompted by a public outcry over the N.S.A.’s domestic operations, the agency’s
critics in Congress have been pushing to limit, rather than expand, its ability to
routinely collect the phone and email records of millions of Americans, while
foreign leaders have protested reports of virtually unlimited N.S.A. surveillance overseas, even in
allied nations. Several inquiries are underway in Washington; Gen. Keith B. Alexander, the
N.S.A.’s longest-serving director, has announced plans to retire; and the White House has offered
proposals to disclose more information about the agency’s domestic surveillance activities.
The N.S.A. document, titled “Sigint Strategy 2012-2016,” does not make clear
what legal or policy changes the agency might seek. The N.S.A.’s powers are
determined variously by Congress, executive orders and the nation’s secret
intelligence court, and its operations are governed by layers of regulations. While asserting
that the agency’s “culture of compliance” would not be compromised, N.S.A. officials argued that
they needed more flexibility, according to the paper.
NSA reform unpopular – Freedom act magnifies the link
Gross, 6/6
Grant Gross, Washington, D.C., correspondent for IDG News Service, “Don't expect major changes to NSA surveillance from Congress,” CIO, 6/6/15, http://www.cio.com.au/article/576813/don-t-expect-major-changes-nsa-surveillance-from-congress/ // IS
The Senate this week passed the USA Freedom Act, which aims to end the NSA's mass
collection of domestic phone records, and President Barack Obama signed the bill hours later.
After that action, expect Republican leaders in both the Senate and the House of
Representatives to resist further calls for surveillance reform. That resistance is
at odds with many rank-and-file lawmakers, including many House Republicans, who
want to further limit NSA programs brought to light by former agency contractor Edward
Snowden.
Civil liberties groups and privacy advocates also promise to push for more
changes. It may be difficult to get "broad, sweeping reform" through Congress, but many
lawmakers seem ready to push for more changes, said Adam Eisgrau, managing director of the
office of government relations for the American Library Association. The ALA has charged the
NSA surveillance programs violate the Fourth Amendment of the U.S. Constitution, which
prohibits unreasonable searches and seizures.
"Congress is not allowed to be tired of surveillance reform unless it's prepared to say it's tired of
the Fourth Amendment," Eisgrau said. "The American public will not accept that."
Other activists are less optimistic about more congressional action. "It will be a
long slog getting more restraints," J. Kirk Wiebe, a former NSA analyst and
whistleblower said by email. "The length of that journey will depend on public outcry -- that is the
one thing that is hard to gauge."
NSA reform is incredibly unpopular
Ferenstein, 14
Gregory Ferenstein, Editor at Ferenstein Wire, graduate of UC-Irvine, “Congress will probably fail
to pass NSA reform, in pursuit of perfection — and Presidential politics,” 11/16/14,
http://venturebeat.com/2014/11/16/congress-will-fail-to-pass-nsa-reform-in-pursuit-of-perfection-and-presidential-politics/ // IS
Congress has yet to enact any surveillance reform — more than a year after
Edward Snowden revealed the National Security Agency’s mass spying program.
As the new Congress convenes for 2015, it appears that the familiar squabbles will prevent
meaningful reform yet again. The biggest hope for reform at the moment is the USA
Freedom Act, which would severely limit the bulk collection of Internet and phone data. It would
also appoint a special civilian defense attorney to the court that approves spying requests (among
many other reforms). However, influential Senators are already taking sides against the
USA Freedom Act. Hawkish members serving on the Senate intelligence committee
have threatened to block any bill that significantly curtails the NSA’s spying powers. Sen.
Saxby Chambliss (R.-Georgia) called the upcoming USA Freedom Act “terrible”
and said he would filibuster any attempt to pass the bill. “It destroys our ability to
fight domestic terrorism in particular, and we’re going to hopefully be able to
avoid having that bill come to the floor,” Chambliss said. On the flip side, early
presidential front-runner Senator Rand Paul (R.-Kentucky) will reportedly
oppose the USA Freedom Act because it doesn’t go far enough in restricting the
NSA. Speaking about the bill, an aide for Paul told CNN that "Due to significant problems with the
bill, at this point he will oppose the Leahy bill.” (Sen. Patrick Leahy, D.-Vermont, is a sponsor of
the USA Freedom bill.) One of the unique and ultimately self-defeating aspects of
NSA reform is that it isn’t split along traditional party lines. There’s internal
dissent within both the Republican and Democratic parties. Even though
Republicans will control both the House and Senate as of 2015, there’s no
agreement within the leadership of the party on surveillance reform.
Anti-surveillance advocates are emboldened now
Alexis, 15
Alexei Alexis, Bloomberg reporter, "Pressure for Surveillance Overhaul To Continue, Despite New
Curbs on NSA," Bloomberg, 6/8/15, http://www.bna.com/pressure-surveillance-overhaul-n17179927445/ // IS
June 3 — Emboldened by passage of the USA FREEDOM Act (H.R. 2048),
technology groups and privacy advocates plan to continue pushing for
surveillance changes that were excluded from the new law.
The USA FREEDOM Act, cleared by Congress June 2 and signed by President Barack Obama the
same day, overhauls National Security Agency surveillance activities exposed by Edward
Snowden.
“This is the most significant surveillance reform measure in the last generation,”
Harley Geiger, deputy director on surveillance and security for the Center for
Democracy & Technology, told Bloomberg BNA June 3. “It's definitely a great
first step, but it still doesn't solve many problems."
solvency deficits
Private companies fail – translation
Mercado, 7
Stephen Mercado, an analyst in the CIA Directorate of Science and Technology, "Reexamining the
Distinction Between Open Information and Secrets," CIA, 4/15/07,
https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/Vol49no2/reexamining_the_distinction_3.htm // IS
Policymakers and intelligence executives would also do well to resist the siren call
of those who argue that we should simply privatize OSINT. Private corporations
are an excellent source of dictionaries, software, and contractors for our
government. But private companies alone are no substitute for accountable,
dedicated OSINT professionals in government offices.[12] Let us take the vital
issue of translation as an example. Contractors— whether individuals, translation
agencies, or research companies (the latter generally subcontracting with translation
agencies or independent translators for the talent they lack in house)—today translate most
of the foreign newspapers, scientific journals, and other open information for the
Intelligence Community. They do so under the lead of cleared OSINT officers
who, knowing both the requirements of the Intelligence Community and the mysteries of the
foreign media, manage the translation flows to provide answers to intelligence
questions. Staff officers are also available to translate priority items themselves on a crash basis
when contractors are unavailable. Staff officers serve one master. Contractors, busy with a
mix of assignments from corporate and government customers, often are
unavailable when most needed.
Ideally, in my view, the government should develop its own sizeable cadre of translators. Yet, that
would be much more expensive than the present system. Some would argue for the opposite path
of privatizing OSINT, which would mean intelligence analysts, case officers, and others buying
their translations directly from the private sector without OSINT officers to apply their general
requests against the appropriate media for the right information or to edit, often heavily,
contractor translations that are frequently of poor quality.
SIGINT privatization empirically fails and gets leaked
Shorrock, 13
Tim Shorrock, has appeared in many publications, including Salon, Mother Jones, The
Progressive, The Daily Beast and the New York Times, “Obama's Crackdown on Whistleblowers,”
The Nation, 3/26/13, http://www.thenation.com/article/173521/obamas-crackdown-whistleblowers?page=full // IS
The hypocrisy is best illustrated in the case of four whistleblowers from the National
Security Agency: Thomas Drake, William Binney, J. Kirk Wiebe and Edward
Loomis. Falsely accused of leaking in 2007, they have endured years of legal harassment
for exposing the waste and fraud behind a multibillion-dollar contract for a
system called Trailblazer, which was supposed to “revolutionize” the way the NSA
produced signals intelligence (SIGINT) in the digital age. Instead, it was canceled
in 2006 and remains one of the worst failures in US intelligence history. But the
money spent on this privatization scheme, like so much at the NSA, remains a
state secret.
AT: Yahoo CP
Yahoo doesn’t matter
Yahoo is irrelevant and declining
O’Brien, 15
Mike O’Brien, Master's degree in journalism from the University of Oregon. “Yahoo's Search
Share on the Decline,” Search Engine Watch, 3/19/15,
http://searchenginewatch.com/sew/news/2400580/yahoos-search-share-on-the-decline // IS
Yahoo's recent increase in the U.S. desktop search share has started to falter,
according to comScore's monthly qSearch analysis. In December, Yahoo replaced
Google as the default search engine on Mozilla Firefox. That month, the first in
the five-year partnership, Yahoo saw its highest share - 11.8 percent - of the
desktop search market since 2009. The number increased to 13 percent in
January, before starting to decline. In February, Yahoo's share went down to 12.8
percent, with those 0.2 percentage points dividing between Google and Microsoft.
Google continues to have the lion's share of the search market; even with Yahoo's comeback,
Google never dipped below 64 percent.
Yahoo doesn’t matter
Lynch, 14
Jim Lynch, former community manager for ZiffNet, ZDNet, PCMag, ExtremeTech, The
FamilyEducation Network, and MSN Games. "The decline and fall of the Yahoo empire," 12/23/14,
http://jimlynch.com/internet/the-decline-and-fall-of-the-yahoo-empire/ // IS
The arrival of Marissa Mayer from Google was supposed to resurrect Yahoo, but
it seems that her efforts have mostly failed. The New York Times magazine has a deep
and detailed portrait of what went wrong at Yahoo under Mayer as CEO, via an excerpt of
an upcoming book by Nicholas Carlson:
In many ways, Yahoo’s decline from a $128 billion company to one worth virtually
nothing is entirely natural. Yahoo grew into a colossus by solving a problem that
no longer exists. And while Yahoo’s products have undeniably improved, and its
culture has become more innovative, it’s unlikely that Mayer can reverse an
inevitability unless she creates the next iPod. All breakthrough companies,
after all, will eventually plateau and then decline. U.S. Steel was the first billion-dollar company in 1901, but it was worth about the same in 1991. Kodak, which once employed
nearly 80,000 people, now has a market value below $1 billion. Packard and Hudson
ruled the roads for more than 40 years before disappearing. These companies matured and
receded over the course of generations, in some cases even a century. Yahoo went
through the process in 20 years. In the technology industry, things move fast.
Yahoo is merging and might even shut down
Lynch, 14
Jim Lynch, former community manager for ZiffNet, ZDNet, PCMag, ExtremeTech, The
FamilyEducation Network, and MSN Games. "The decline and fall of the Yahoo empire," 12/23/14,
http://jimlynch.com/internet/the-decline-and-fall-of-the-yahoo-empire/ // IS
So what now for Yahoo? According to the NY Times article it seems that
Yahoo’s days are numbered as an independent company. A merger with
AOL seems to be in the works, and if that happens there will no doubt be a lot of
layoffs at Yahoo as the combined company seeks to cut costs by restructuring. At
this point I doubt there’s much that Marissa Mayer or anyone else can do to stop
this from happening if AOL decides to pursue the merger.
It makes me wonder if, in a few years down the road, we’ll see a story about how
AOL has decided to pull the plug on Yahoo altogether and shut it down. What a
very sad fate that would be for a former giant of the Internet like Yahoo.
AT: Snowball CP
solvency deficit
CP is not enough to solve snowball
Carley and Tsvetovat, 6
Kathleen Carley and Maksim Tsvetovat, *professor in the School of Computer Science in the
department - Institute for Software Research - at Carnegie Mellon University, **Ph.D. from
Carnegie Mellon University's School of Computer Science, with concentration on computational
modeling of organizations, “On effectiveness of wiretap programs in mapping social networks,”
8/30/06, Springer Science + Business Media, LLC, Proquest // is amp
Our experiments show that it is possible to obtain high-quality data on covert
networks without using random traffic sampling (e.g. Echelon) or snowball
sampling, both of which capture too much unnecessary data, and do not make
good use of the data that has been already captured. As an alternative, use of
optimization-based and socially intelligent sampling techniques allows for a
tightly targeted and resource-thrifty SIGINT gathering program. Not only are these
techniques significantly less costly, they also produce better overall
intelligence data with a closer match to the covert network being studied. The
annealing-based information capture is not limited to sampling communications within covert
networks, but rather is a flexible methodology for surveying networks of hidden populations. As
the optimization-based sampling techniques do not require as much information
capture as snowball sampling, they will present less of a resource strain on data
collectors and are a more cost-efficient way to sample communication for analysis of social
networks.
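The contrast the card draws can be sketched in code. This is a purely illustrative toy (the graph, function names, and the greedy degree heuristic are my assumptions, not Carley and Tsvetovat's actual annealing-based procedure): snowball sampling expands outward from a seed and captures every contact it touches, while a budget-limited targeted sampler follows only the most promising contact at each step and so captures far less data.

```python
# Toy communication network: node -> set of contacts (hypothetical data).
GRAPH = {
    "a": {"b", "c", "d"},
    "b": {"a", "e"},
    "c": {"a", "e", "f"},
    "d": {"a"},
    "e": {"b", "c", "g"},
    "f": {"c"},
    "g": {"e"},
}

def snowball_sample(graph, seed, waves):
    """Capture the seed plus every contact reachable within `waves` hops."""
    seen = {seed}
    frontier = {seed}
    for _ in range(waves):
        # Expand to all unseen neighbors of the current frontier.
        frontier = {n for node in frontier for n in graph[node]} - seen
        seen |= frontier
    return seen

def targeted_sample(graph, seed, budget):
    """Greedy stand-in for an optimization-based sampler: under a fixed
    collection budget, follow only the highest-degree unseen contact."""
    seen = {seed}
    current = seed
    while len(seen) < budget:
        candidates = [n for n in graph[current] if n not in seen]
        if not candidates:
            break
        # Heuristic: the best-connected contact is the most informative.
        current = max(candidates, key=lambda n: len(graph[n]))
        seen.add(current)
    return seen
```

On this toy graph, a two-wave snowball from "a" sweeps up six of the seven nodes, while the budgeted sampler captures only the three best-connected ones, mirroring the card's claim that targeted collection is less of a resource strain than snowball sampling.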
A2: Splunk CP
Squo solves
The government already uses Splunk
Splunk, 15
Splunk, “Defense and Intelligence Agencies,” Splunk’s Website, 2015,
http://www.splunk.com/en_us/solutions/industries/public-sector/defense-and-intelligence-agencies.html // IS
Government defense and intelligence agencies are tasked with collecting,
analyzing and storing massive amounts of data to detect and correlate patterns of
activity related to security threats. They also need systems that can handle extremely
granular role-based access controls (RBAC) so that only those that ‘need-to-know’ have access to
the right data at the right time. Splunk® Enterprise, the industry standard product
for big data analysis, is widely deployed across hundreds of government
agencies to help pass their FISMA assessments and: Detect patterns and anomalies
across terabytes of raw data in real time without specialized skills, up-front data normalizations or
fixed schemas; Can use Hunk®: Splunk Analytics for Hadoop and NoSQL data stores to provide a
unified view of your data; Automatically monitor for NIST 800-53 controls supporting the 800-37
risk management framework; Support continuous monitoring and the acquisition of context data
from any event from any layer of the IT structure
A2: UQ CP
Links to ptx
Congress keeps a close watch
Best, 1
(Richard Jr, Specialist in National Defense, Foreign Affairs, Defense, and Trade Division, CRS
Report for Congress, “The National Security Agency: Issues for Congress”,
http://fas.org/irp/crs/RL30740.pdf, amp)
NSA’s efforts to adjust to the changing geopolitical and technological environment have
been strongly encouraged by Congress and reflect a major shift in congressional
oversight of the Agency. Although Congress has always approved funding for NSA,
for decades routine oversight was limited to a few Members and staff. In the 1970s,
congressional investigations of intelligence agencies resulted in greater public
attention to NSA and criticism of activities that infringed on the civil liberties of U.S.
persons. Subsequently, both the Senate and the House of Representatives established
intelligence oversight committees that have closely monitored NSA’s operations. The
Foreign Intelligence Surveillance Act (FISA) was enacted in 1978 to regulate collection by
foreign intelligence agencies of the communications of U.S. persons. The end of the Cold War, the
expansion of low-cost encryption and the explosion of communications systems led Congress to
take a more public profile in overseeing the large and secretive Agency.
A2: K
note
All the regular answers to policy overload are responsive
2ac NSA failure bad
The counter-advocacy causes the NSA to fail – that’s what makes the
public want to fund it even more – every time a terrorist attack gets
through it becomes a justification for defense spending – this results
in endless expansion of the surveillance state as more information
causes an overload that guarantees even more failure
Engelhardt 14 – co-founder of the American Empire Project, Teaching Fellow at the
Graduate School of Journalism at the University of California, Berkeley (Tom, “Failure Is Success:
How US Intelligence Works in the 21st Century,” Truthout, 9/30/2014, http://www.truthout.org/opinion/item/26529-failure-is-success-how-us-intelligence-works-in-the-21st-century)
//RGP
Whatever the case, while
taxpayer dollars flowed into your coffers, no one considered it
a problem that the country lacked 17 overlapping outfits bent on preventing
approximately 400,000 deaths by firearms in the same years; nor 17 interlocked
agencies dedicated to safety on our roads, where more than 450,000 Americans
have died since 9/11. (An American, it has been calculated, is 1,904 times more likely to die in a car accident
than in a terrorist attack.) Almost all the money and effort have instead been focused on the microscopic number of
terrorist plots -- some spurred on by FBI plants -- that have occurred on American soil in that period. On the conviction
that Americans must be shielded from them above all else and on the fear that 9/11 bred in this country, you’ve
built
an intelligence structure unlike any other on the planet when it comes to size,
reach, and labyrinthine complexity. It’s quite an achievement, especially when you consider its one
downside: it has a terrible record of getting anything right in a timely way. Never have so many had access to so much
information about our world and yet been so unprepared for whatever happens in it. When it comes to getting ahead of
the latest developments on the planet, the ones that might really mean something to the government it theoretically
serves, the IC is -- as best we can tell from the record it largely prefers to hide -- almost always behind the 8-ball. It seems
to have been caught off guard regularly enough to defy any imaginable odds. Think about it, and think hard. Since
9/11 (which might be considered the intelligence equivalent of original sin when it comes to missing the mark), what
exactly are the triumphs of a system the likes of which the world has never seen before? One and only one
event is sure to come immediately to mind: the tracking down and killing of
Osama bin Laden. (Hey, Hollywood promptly made a movie out of it!) Though he was by then essentially a
toothless figurehead, an icon of jihadism and little else, the raid that killed him is the single obvious triumph of these
years. Otherwise, globally from the Egyptian spring and the Syrian disaster to the crisis in Ukraine, American intelligence
has, as far as we can tell, regularly been one step late and one assessment short, when not simply blindsided by events. As
a result, the Obama administration often seems in a state of eternal surprise at developments across the globe. Leaving
aside the issue of intelligence failures in the death of an American ambassador in Benghazi, for instance, is there any
indication that the IC offered President Obama a warning on Libya before he decided to intervene and topple that
country’s autocrat, Muammar Gaddafi, in 2011? What we know is that he was told, incorrectly it seems, that there would
be a “bloodbath,” possibly amounting to a genocidal act, if Gaddafi's troops reached the city of Benghazi. Might an agency
briefer have suggested what any reading of the results of America's twenty-first century military actions across the Greater
Middle East would have taught an observant analyst with no access to inside information: that the fragmentation of
Libyan society, the growth of Islamic militancy (as elsewhere in the region), and chaos would likely follow? We have to
assume not, though today the catastrophe of Libya and the destabilization of a far wider region of Africa is obvious. Let’s
focus for a moment, however, on a case where more is known. I’m thinking of the development that only recently riveted
the Obama administration and sent it tumbling into America’s third Iraq war, causing literal hysteria in Washington.
Since June, the most successful terror group in history has emerged full blown in Syria and Iraq, amid a surge in jihadi
recruitment across the Greater Middle East and Africa. The Islamic State (IS), an offshoot of al-Qaeda in Iraq, which
sprang to life during the U.S. occupation of that country, has set up a mini-state, a “caliphate,” in the heart of the Middle
East. Part of the territory it captured was, of course, in the very country the U.S. garrisoned and occupied for eight years,
in which it had assumedly developed countless sources of information and recruited agents of all sorts. And yet, by all
accounts, when IS’s militants suddenly swept across northern Iraq, the CIA in particular found itself high and dry. The IC
seems not to have predicted the group’s rapid growth or spread; nor, though there was at least some prior knowledge of
the decline of the Iraqi army, did anyone imagine that such an American created, trained, and armed force would so
summarily collapse. Unforeseen was the way its officers would desert their troops who would, in turn, shed their uniforms
and flee Iraq’s major northern cities, abandoning all their American equipment to Islamic State militants. Nor could the
intelligence community even settle on a basic figure for how many of those militants there were. In fact, in part because IS
assiduously uses couriers for its messaging instead of cell phones and emails, until a chance arrest of a key militant in
June, the CIA and the rest of the IC evidently knew next to nothing about the group or its leadership, had no serious
assessment of its strength and goals, nor any expectation that it would sweep through and take most of Sunni Iraq. And
that should be passing strange. After all, it now turns out that much of the future leadership of IS had spent time together
in the U.S. military’s Camp Bucca prison just years earlier. All you have to do is follow the surprised comments of various
top administration officials, including the president, as ISIS made its mark and declared its caliphate, to grasp just how ill-prepared 17 agencies and $68 billion can leave you when your world turns upside down.
Producing Subprime Intelligence as a Way of Life
In some way, the
remarkable NSA revelations of Edward Snowden may
have skewed our view of American intelligence. The question, after all, isn’t simply: Who did they
listen in on or surveil or gather communications from? It’s also: What did they find out? What did they
draw from the mountains of information, the billions of bits of intelligence data
that they were collecting from individual countries monthly (Iran, 14 billion; Pakistan, 13.5
billion; Jordan, 12.7 billion, etc.)? What was their “intelligence”? And the answer seems to be that, thanks to the
mind-boggling number of outfits doing America’s intelligence work and the
yottabytes of data they sweep up, the IC is a morass of information overload,
data flooding, and collective blindness as to how our world works. You might
say that the American intelligence services encourage the idea that the world is only
knowable in an atmosphere of big data and a penumbra of secrecy. As it happens, an open and open-minded assessment of the planet and its dangers would undoubtedly tell any government so much more. In that sense,
the system bolstered and elaborated since 9/11 seems as close to worthless in terms of
bang for the buck as any you could imagine. Which means, in turn, that we outsiders should view with a jaundiced eye the
latest fear-filled estimates and overblown "predictions" from the IC that, as now with the tiny (possibly fictional) terror
group Khorasan, regularly fill our media with nightmarish images of American destruction. If
the IC’s post-9/11
effectiveness were being assessed on a corporate model, it’s hard not to believe
that at least 15 of the agencies and outfits in its “community” would simply be axed and the
other two downsized. (If the Republicans in Congress came across this kind of institutional tangle and record of failure in
domestic civilian agencies, they would go after it with a meat cleaver.) I suspect that the government could learn far more
about this planet by anteing up some modest sum to hire a group of savvy observers using only open-source information.
For an absolute pittance, they
would undoubtedly get a distinctly more actionable vision
of how our world functions and its possible dangers to Americans. But of course we’ll
never know. Instead, whatever clever analysts, spooks, and operatives exist in the maze of America’s spy and surveillance
networks will surely remain buried there, while the overall system produces vast reams of
subprime intelligence. Clearly, having a labyrinth of 17 overlapping,
paramilitarized, deeply secretive agencies doing versions of the same thing is the
definition of counterproductive madness. Not surprisingly, the one thing the U.S.
intelligence community has resembled in these years is the U.S. military, which
since 9/11 has failed to win a war or accomplish more or less anything it set out to do. On the other hand,
all of the above assumes that the purpose of the IC is primarily to produce successful “intelligence” that leaves the White
House a step ahead of the rest of the world. What
if, however, it's actually a system organized on
the basis of failure? What if any work-product disaster is for the IC another kind
of win? Perhaps it's worth thinking of those overlapping agencies as a fiendishly clever Rube
Goldberg-style machine organized around the principle that failure is the greatest
success of all. After all, in the system as it presently exists, every failure of intelligence is just
another indication that more security, more secrecy, more surveillance, more
spies, more drones are needed; only when you fail , that is, do you get more
money for further expansion. Keep in mind that the twenty-first-century version of
intelligence began amid a catastrophic failure: much crucial information about the 9/11 hijackers
and hijackings was ignored or simply lost in the labyrinth. That failure, of course, led
to one of the great
intelligence expansions, or even explosions, in history. (And mind you, no figure in authority in the national
security world was axed, demoted, or penalized in any way for 9/11 and a number of them were later given awards and
promoted.) However
they may fail, when it comes to their budgets, their power, their
reach, their secrecy, their careers, and their staying power, they have succeeded
impressively.
Aff Misc
Serial Policy Failure
Banking on tech advances to solve overload causes serial policy
failure
Woods, et al, 2
(D.D. Woods, Emily Patterson, Emilie Roth, Professors, Cognitive Systems Engineering
Laboratory, Institute for Ergonomics, Cognition, Technology and Work, April 2002, Volume 4,
Issue 1, pp 22-36, “Can We Ever Escape From Data Overload? A Cognitive Systems Diagnosis”,
http://link.springer.com/article/10.1007/s101110200002, amp)
Each round of technical advances, whether in artificial intelligence, computer graphics or
electronic connectivity, promises to help people better understand and manage a whole host of
activities, from financial analysis to monitoring data from space missions to controlling the national air
space. Certainly, this ubiquitous computerisation of the modern world has tremendously
advanced our ability to collect, transmit and transform data, producing unprecedented
levels of access to data.
However, our ability to interpret this avalanche of data, i.e., to extract meaning
from artificial fields of data, has expanded much more slowly, if at all. In studies across
multiple settings, we find that practitioners are bombarded with computer-processed data,
especially when anomalies occur. We find users lost in massive networks of computer-based
displays, options and modes. For example, one can find a version of the following statement
in most accident investigation reports: although all of the necessary data was
physically available, it was not operationally effective. No one could assemble the
separate bits of data to see what was going on. (Joyce and Lapinski 1983) The challenge has become
finding what is informative given our interests and needs in a very large field of available data.
The paper is organised as follows. To set the stage, we characterise how technology
change has created a
paradoxical situation and, we introduce people as a model of competence through a historical
example. From this base we summarise the three different major characterisations of the data overload problem. We
then provide a ‘diagnosis’ of what makes data overload a difficult problem based on a synthesis of results from past studies
that examine how new computerised devices can help overcome or can exacerbate data overload-related problems in
control centres such as mission control for space shuttle operations, highly automated aviation flight decks, computerised
emergency operations control centres in nuclear power plants and surgical anaesthetic management systems in operating
rooms. Given this background, we can see how the typical solutions to the data overload problem avoid confronting the
heart of the matter directly, remaining content to nibble away at the edges through indirect means. Finally, we outline a
direction for progress towards more effective solutions to data overload relying on people as a competence model.
1.1. The Data Availability Paradox
Our situation seems paradoxical: more and more data is available in principle, but our
ability to interpret what is available has not increased. On one hand, all participants in a field of
activity recognise that having greater access to data is a benefit in principle. On the other hand, these same participants
recognise how the
flood of available data challenges their ability to find what is
informative or meaningful for their goals and tasks (Miller 1960). We will refer to this as the data availability
paradox. Data availability is paradoxical because of the simultaneous juxtaposition of our success and our vulnerability.
Technological change grows our ability to make data readily and more directly
accessible – the success, and, at the same time and for the same reasons, the change increasingly and
dramatically challenges our ability to make sense of the data available – the vulnerability.
1.2. ‘A Little More Technology Will Be Enough’
Criando dificuldades para vender facilidades [creating difficulties to sell solutions]. (Common Brazilian saying)
As the powers of technology explode around us, developers imagine potential benefits and
charge ahead in pursuit of the next technological advance. The claim is that data
overload and other problems will be solved by significant advances in machine ‘information’
processing, i.e., the technology for creating sophisticated graphics, for connecting distant people together and for
creating intelligent software agents.
However, after
each round of development, field researchers continue to observe
beleaguered practitioners actively trying to cope with data overload in one form or another.
This is a fundamental finding, repeatedly noted in many fields of practice and with many kinds of technology (e.g., Woods
1995a; Woods and Patterson 2000). When viewed in context, systems, developed putatively to
aid users, often
turn out to create new workload burdens when practitioners are busiest, new attentional
demands when practitioners are plagued by multiple channels/voices competing for
their attention, and new sources of data when practitioners are overwhelmed by
too many channels spewing out too much ‘raw’ data (Woods et al 1994, Ch. 5).
In practice, new
rounds of technology development become yet another voice in the data
cacophony around us. Ironically, the major impact has been to expand the problem
beyond specialised technical fields of activity (an aircraft cockpit or power plant control room) to
broader areas of activity (web-based activities we engage in everyday).
Academic studies prove excessive information hurts decisionmaking
Metzger, 5
(Michael, Chair in Business Ethics, Kelley School of Business, Indiana University, December
2005, University of Florida Journal of Law & Public Policy, 16 U. Fla. J.L. & Pub. Pol'y 435,
“ARTICLE: BRIDGING THE GAPS: COGNITIVE CONSTRAINTS ON CORPORATE CONTROL &
ETHICS EDUCATION”, lexis, amp)
From a thinking quality perspective, error can be introduced in every phase of the human thought process.
Much of the critical action in decisionmaking of all sorts occurs in the problem identification and classification stages. n89
Do we even admit that we have a problem, n90 and if [*456] we do, how do we classify it and the people involved? n91
Individual desires n92 and bias n93 can play a pivotal role in both problem identification and problem framing. n94 Even
things such as the language we use to describe the problem can have a powerful impact on our final decision. n95 [*457]
If our attention is selective, n96 then we may not register all of the relevant data
available in our external environment. n97 If our memories are also selective and frequently inaccurate,
n98 this can be expected to have a consequent negative effect on our ability to bring our past experience to bear in solving
current problems. n99 Further, our desire to maintain our self- esteem n100 may lead us to accept dubious arguments and
data, n101 to reject compelling arguments and data, n102 and to persist in behaviors and strategies long after an objective
observer would have concluded that they were ineffective. n103
Even in circumstances where our memories are accurate, our initial perceptions are complete, and our egos are in check,
other threats to good decisionmaking exist. In the heuristic phase of our reasoning process, [*458] preconscious mental
programs called heuristics n104 retrieve relevant information from memory and identify which bits of externally available
information are relevant, and therefore subject to further processing. n105 Heuristics are necessary parts of human
cognition for at least two reasons. First, they are essential elements in maintaining some semblance of cognitive economy.
We each have limited processing power, and thinking shortcuts n106 that require no conscious
thought can be a very efficient way to process information. n107 Second, without
some device to screen
information, we would quickly suffer cognitive overload from the millions of bits
of information embedded in our consciousness and in our environment. n108
[*459]
But while heuristics are essential and may work well most of the time, they can sometimes result in bias n109 in our
reasoning. n110 So, the information
that seems "relevant" to us psychologically may not
necessarily be what is relevant logically. n111 If this happens, no amount of good
reasoning in the conscious, analytical phase of our decisionmaking is likely to
lead to a correct solution. Why? Good reasoning based on bad information is
unlikely to lead to good conclusions. n112 Information can be "bad" if it is [*460]
inaccurate or incomplete, but it can also be "bad" if it is complete and completely accurate, but not
really germane to the thinking task at hand.
Even if our information is accurate and complete, and our selection of it is free of bias,
we may nonetheless make thinking mistakes during the conscious, analytical
phase of the thought process if our grasp of logic, basic probability principles, or statistics
is poor, or if certain aspects of the problem prevent us from bringing our full
reasoning powers to bear on it. n113 Because it is conscious, and because everyone can learn to improve
her logical reasoning ability and improve her understanding of basic statistical principles, n114 the analytical phase of the
reasoning process should be the phase that is most amenable to improvement.
The heuristic phase is, by definition, more problematic because it is unconscious. n115 As a leading authority on bias puts
it, "the representational [*461] heuristics responsible for many biases constitute preconscious processes. Subjects are
aware of that to which they are attending but not of the selective process directing their attention." n116
Infoglut
We are in an era of information overload – a period characterized by
an intractable paradox – while we have access to seemingly infinite
amounts of information, we are simultaneously unable to process it –
the attempt to become omniscient is counterproductive in that it
makes it harder to understand and trust information sources
Andrejevic 13 – Honorary Research Associate Professor, Centre for Critical and Cultural
Studies; media scholar who writes about surveillance, new media, and popular culture (Mark,
“InfoGlut,” Data Overload) //RGP
After a two-year investigation into the post-9/11 intelligence industry, the
Washington Post revealed that a sprawling array of public and private agencies
was collecting more information than anyone could possibly comprehend. As the
newspaper’s report put it, “Every day, collection systems at the National Security Agency
intercept and store 1.7 billion e-mails, phone calls and other types of
communications. The NSA sorts a fraction of those into 70 separate databases.”1 The NSA is merely
one amongst hundreds of agencies and contractors vacuuming up data to be sifted,
sorted, and stored. The resulting flood of information is, in part, a function of the
technological developments that have made it possible to automatically collect, store, and share fantastic
amounts of data. However, making sense of this information trove at the all-too-human
receiving end can pose a problem: “Analysts who make sense of documents and conversations obtained
by foreign and domestic spying share their judgment by publishing 50,000 intelligence reports each year – a volume so
large that many are routinely ignored.”2 The
so-called “Super Users” who are supposed to have
access to the whole range of information generated by the intelligence apparatus
reportedly told the Post that “there is simply no way they can keep up with the
nation’s most sensitive work.”3 As one of them put it, “I’m not going to live long enough to be briefed on
everything.”4 The lament is a familiar one in an era of information overload – and
not just for intelligence agencies, marketers, and other collectors of databases. The same challenge is faced by any citizen
attempting to read all of the news stories (or Tweets, or status updates, or blog posts) that are published on a given day,
or a financial analyst researching all of the available information pertaining to the performance of a particular company.
When I was a journalist in the early 1990s, just as computers entered the newsroom, we had available to us several
electronic newswires that updated themselves automatically with stories on topics ranging from international news to US
politics to sports and entertainment. I remember thinking at the time that it was impossible to keep up with the news as it
unfolded on my screen. By the time I had read one wire story, dozens of new ones had been filed from around the world.
That was just a tiny taste of the coming information cornucopia. Now an unimaginably unmanageable flow of mediated
information is available to anyone with Internet access. The
paradox of an era of information glut
emerges against the background of this new information landscape: at the very
moment when we have the technology available to inform ourselves as never
before, we are simultaneously and compellingly confronted with the impossibility
of ever being fully informed. Even more disturbingly, we are confronted with this
impossibility at the very moment when we are told that being informed is more
important than ever before to our livelihood, our security, and our social lives. This is not to suggest that it
might, once upon a time, have been possible to be “fully informed” – in the sense of knowing all the details of the daily
events, their various causes, explanations, and interpretations relating to our social, cultural, political, and economic lives.
As Jorge Luis Borges’s (insomnia-inspired) allegory of the mnemonic phenomenon Funes suggests, every
day we
are bombarded with more information than we can possibly absorb or recall. The
ability to capture and recount all of this information in detail is precisely what
made Funes a freak – or a god: “We, at one glance, can perceive three glasses on a table; Funes, all the
leaves and tendrils and fruit that make up a grape vine. He knew by heart the forms of the southern clouds at dawn on the
30th of April, 1882, and could compare them in his memory with the mottled streaks on a book in Spanish binding he had
only seen once and with the outlines of the form raised by an oar in the Rio Negro the night before the Quebracho
uprising.”5 There are, of course, some drawbacks to total information awareness, Funes-style: it took him a full day to
remember a day (and presumably even longer to recall the day spent remembering it). Moreover, Funes was only
recording his direct experiences – as yet un-augmented by the Internet and its bottomless reserves of mediated
information. If
it has always been impossible to fully absorb the information by which
we are surrounded – still more so to be “fully informed” – the palpable
information overload associated with the digital, multi-channel era has made us
aware as never before of this impossibility. In his book Data Smog, David Shenk observed that “It is
estimated that one weekday edition of today’s New York Times contains more
information than the average person in seventeenth-century England was likely
to come across in a lifetime.”6 He does not say who did the estimating – and it is a formulation whose
credibility, such as it is, depends on a particular definition of information: “in mass mediated form.” Surely during the
17th century people were absorbing all kinds of information directly from the world around them, as we do today through
the course of our daily lives. There is little indication that our sensory apparatus has become more finely tuned or
capacious. However, the amount of mediated information – that which we self-consciously reflect upon as information
presented to us in constructed and contrived formats (TV shows, movies, newspapers, Tweets, status updates, blogs, text
messages, and so on) via various devices including televisions, radios, computers, and so on – has surely increased
dramatically, thanks in no small part to the proliferation of portable, networked, interactive devices. Even
before
the advent of these devices, all we had to do was go to the library to feel
overwhelmed by more than we could possibly absorb. Now this excess confronts
us at every turn: in the devices we use to work, to communicate with one another,
to entertain ourselves. Glut is no longer a “pull” phenomenon but a “push” one. We don’t go to it, it comes to us.
It is the mediated atmosphere in which we are immersed. When all we had to do to keep up with the news, for example,
was to read a daily newspaper and watch the network evening news, it was easier to imagine the possibility that someone
like Walter Cronkite could tell us “the way it is” during the half-hour interlude of an evening newscast. By
the first
decade of the 21st century, the era of the most-trusted man in America was long
gone, as evidenced, for example, by a poll revealing that despite (or perhaps
because of) the proliferation of hours devoted to television news, not one major
news outlet was trusted by the majority of the American people. Poll upon poll has
revealed declining levels of public trust in news outlets and a heightened sense of perceived bias on the part of journalists.
The researcher responsible for a 2008 poll noted that “an astonishing percentage of Americans see biases and
partisanship in their mainstream news sources” presumably because, “The availability of alternative viewpoints and news
sources through the Internet ... contributes to the increased skepticism about the objectivity of profit-driven news outlets
owned by large conglomerates.”7 It
is not just that there is more information available, but
that this very surfeit has highlighted the incompleteness of any individual
account. An era of information overload coincides, in other words, with the
reflexive recognition of the constructed and partial nature of representation.
Surveillance is no exception – the normalization of intrusive
surveillance programs is justified under the pre-emptive logic of
deterrence – we are convinced that once we know everything, we can
prevent all crime
Andrejevic 13 – Honorary Research Associate Professor, Centre for Critical and Cultural
Studies; media scholar who writes about surveillance, new media, and popular culture (Mark,
“InfoGlut,” "Foreknowledge is Supremacy") //RGP
These formulations bring us a bit closer to the notion of deterrence: a kind of preemption of the core experience of desire itself – what gets averted is the moment
of lack with which this experience coincides. Taken to its limit, the goal is to
relegate desire to the pre-empted status of crime in “The Minority Report”: “pure metaphysics.”
If you’re jailed before you committed the crime, are you really guilty? If the database
knows what you want before you do, did you really want it? The lack is filled before it is subjectively
perceived. Would the crime really have happened? Was the desire really there? Is that purchase/
search term really what the subject wanted prior to the precipitation of the moment of consumption or search? The
mutants and the statisticians say yes, and who is to prove them wrong? In
his description of what he
describes as the simulation of surveillance, Bogard argues that predictive
analytics is not simply about predicting outcomes, but about devising ways of
altering them. In policing terms, the goal of predicting the likelihood of criminal
behavior is to deter it. In marketing or campaigning terms it is to anticipate desire before it happens – to
precipitate an accelerating range of latent desires that were allegedly “already there.” Transposed into business jargon, as
one digital marketing executive put it, “In the early days of digital marketing, analytics emerged to tell us what happened
and, as analytics got better, why it happened. Then solutions emerged to make it easier to act on data and optimize
results.”33 The more data that can be processed faster, the better for turning “big data into a big opportunity.”34 The
promise of predictive analytics is to incorporate the future as a set of anticipated
data points into the decision-making process: “Historically all Web analytics have reflected data
from the past which has been to a certain extent like driving a car using only the rear view mirror … for the first time we
can be marketers using data in a manner that allows us to drive while facing the road ahead.”35 It is a vision of a future in
which the structure outlined by predictions is subject to modification along certain pivot points. If, for example, a credit
card company can predict a scenario that might lead to losses, it can intervene in advance to attempt to minimize these, as
in one example described by the New York Times: “credit-card companies keep an eye on whether you are making
purchases of that kind [indicating marital problems], because divorce is expensive and they are paranoid that you might
stop paying your credit-card bill. For example, if you use your card to pay for a marriage counselor, they might decrease
your credit line.”36 Similarly, in the book Super Crunchers, Ian Ayres describes how the credit card company CapOne uses
data mining to predict the smallest possible reduction in interest rates that can be used to retain customers. When
someone calls in with a complaint about a card’s high interest rates, a computer uses detailed information about the
consumer combined with information about how similar customers have behaved in the past to rapidly generate a range of
rates likely to pre-empt the consumer’s cancellation request: “Because of Super Crunching, CapOne knows that a lot of
people will be satisfied with this reduction (even when they say they’ve been offered a lower rate from another card).”37
The fact that marketing analogies are so frequently used to introduce the topic of
predictive analytics reflects both the pioneering role played by the commercial
sector and what might be described as its ordinariness: the way in which the use of data mining
has incorporated itself into our understanding of how the world works in the digital era. Customized recommendations on
Amazon.com and targeted advertising on the Internet, not to mention targeted mailings and customized coupons in the
supermarket, have become the commonplaces of everyday life in contemporary, information-rich societies. What
once might have seemed slightly creepy – Google scanning our email, for
example, to figure out how best to market to us – has become a normal and
largely overlooked part of daily life for millions of email users. This level of
normalcy helps to pave the way for forms of population-level police surveillance
that might previously have seemed intrusive or otherwise inappropriate. As a report
for the National Institute of Justice put it, “Walmart, for example, learned through analysis that when a major weather
event is in the forecast, demand for three items rises: duct tape, bottled water and strawberry Pop-Tarts … Police can use a
similar data analysis to help make their work more efficient … some in the field believe it has the potential to transform
law enforcement by enabling police to anticipate and prevent crime instead of simply responding to it.”38 If we are
submitting to detailed monitoring to help enhance Pop-Tart sales, surely we can
do it for public safety and national security. Viewed from a slightly different
perspective, it is hard to avoid the notion that we are living in an era of rampant
surveillance creep. Whereas once upon a time it might have seemed strange to allow police to scan and store
license plate numbers of everyone who drives by a particular location, this now takes place as a matter of course.
Predictive policing, in this regard, is just piggybacking on the “new normal” of digital, interactive monitoring.
Reinforcing this normality are the claims made on behalf of predictive policing. In
Boston, officials reported that serious crime in the Cambridge area in 2011 dropped to its lowest level in 50 years after
police adopted a data-driven predictive policing program (tellingly, the claim does not distinguish between correlation and
causation). The murder rate actually increased – but police said this was a result of domestic disputes that they could not
(yet?) predict.39 In Santa Cruz, police reported a significant drop in burglaries after adopting a predictive policing
program developed by mathematicians, an anthropologist, and a criminologist based on models for predicting earthquake
aftershocks.40 In Memphis, officials reported a 15 percent drop in serious crime over four years after adopting a database-driven predictive policing program.41 Police are experimenting with a growing range of variables to predict crime, ranging
from weather patterns to building code violations. In Arlington, Texas, police reported that every unit increase in physical
decay of the neighborhood (measured by code violations) resulted in six more residential burglaries in the city.42 For the
moment, however, the most common indicator seems to be past patterns of behavior. In Santa Cruz, for example, two
women were taken into custody for peering into cars in a parking garage that the computer indicated would be at risk for
burglaries that day: “One woman was found to have outstanding warrants; the other was carrying illegal drugs.”43 If,
for the moment, the methodologies seem relatively crude (but potentially effective – at least
on occasion), it is worth keeping in mind that current systems rely on only
a tiny current of the swelling information flood. However, recent regulatory
shifts propose to make much more data available. As of this writing, legislators in the
UK have proposed giving intelligence agencies access to the phone records,
browsing details, emails, and text messages of all Britons without a warrant.44 In
the US, updated “guidelines” for the National Counter Terrorism Center allow the
organization to collect data about any American without a warrant and keep it for
up to five years. It also permits the center to data mine this information for the purposes of investigating
terrorism.45 Total Information Awareness as a named program may have
disappeared, but as an unnamed initiative it continues to develop
apace.
With increasing floods of information comes a widening power gap –
those with access to bulk data gain advantage over those without it
Andrejevic 13 – Honorary Research Associate Professor, Centre for Critical and Cultural
Studies; media scholar who writes about surveillance, new media, and popular culture (Mark,
“InfoGlut,” Simulation as Deterrence) //RGP
The question recalls the post-9/11 data-driven plans of Admiral John Poindexter for a Total Information Awareness
program that would sift through a giant database of databases in search of threat indicators. Indeed, the
Wall
Street Journal opens its op-ed piece about the Colorado shooting with the
question, “Would Total Information Awareness have stopped James Eagan
Holmes [the suspect in the Colorado shooting]?”2 Put that way, the question sounds almost
rhetorical: “total information awareness” implies a high degree of predictive
power: if you could keep an electronic eye on everyone’s actions all the time, surely you could unearth the symptoms of
eventual wrongdoing. Set aside for a moment that the version of security on offer requires willing
submission to “total” surveillance and simply consider the fantasy of pre-emption
opened up by the technology: “a future landscape of surveillance without limits –
everything visible in advance, everything transparent, sterilized, and risk-free,
nothing secret, absolute foreknowledge of events.”3 If this sounds futuristic and
vaguely absurd, consider the claims that are currently being made on behalf of
so-called predictive policing, which uses past crime patterns and related data to guide the deployment of
police patrols: “It is now possible to predict the future when it comes to crime, such as identifying crime trends,
anticipating hotspots in the community, refining resource deployment decisions, and ensuring the greatest protection for
citizens in the most efficient manner.”4 It
is perhaps a telling sign of the power of the promise
of new information and communication technologies, based on their ability to
collect, store, and process huge amounts of data, that one of our first reactions to
the unexpected has become: “could the database have predicted it?”– and the automatic
corollary: “could the database have prevented it?” Lurking in these two questions is an
assumption about the character of knowledge in the digital era: the notion that
the only limit on our predictive power is the ability to effectively organize all the
available information. If this were indeed the case, then the development of technological information storage
and processing technology might compensate for the shortcomings of the human brain by ushering in new forms of
aggregate “knowledge” and predictive power. Such
forms of “knowing” would, in a sense, exceed
the limits of human comprehension. It would no longer be a question of comprehending the data or
using it to understand, in referential fashion, the world to which it refers, but rather of putting the data to use. The
promise of automated data processing is to unearth the patterns that are far too complex for any human analyst to detect
and to run the simulations that generate emergent patterns that would otherwise defy our predictive power. The
form
of “knowledge” on offer is limited to those with access to the database and the
processing power, and it replicates the logic of “knowing without knowing”
insofar as it can serve as the basis for decisions while exceeding the processing
power of any individual human brain. In keeping with the logic of digital
convergence, this form of knowledge is portrayed by its proponents as universal
insofar as it is generalizable across the political, economic, and social domains. It
can be used to predict consumer behavior as well as the spread of disease, or the likelihood that someone will need to be
hospitalized within the coming year. Keeping this convergent background in mind, this chapter will focus on the
somewhat narrower example of policing and security in order to explore the knowledge practices associated with data
mining and predictive analytics in the era of “big data.” In particular, the focus will be upon the version of distributed,
predictive “knowledge” that emerges from the database. As McCue puts it in her discussion of the use of predictive
analytics for security purposes, “With data mining we can perform exhaustive searches of very large databases using
automated methods, searching well beyond the capacity of human analysts or even a team of analysts.”5 In the wake of the
development of database technology, there is an emerging tendency to devalue individual comprehension in comparison
with the alleged predictive power derived from “super-crunching” tremendous amounts of data. This development
has significant implications for the promise that because new information and
communication technologies are less or non-hierarchical, they are therefore
forces for democratization and user empowerment. If the (allegedly) more
powerful and productive forms of knowledge associated with “big data” are
limited to those with access to the database and processing power, digital-era
knowledge practices could prove to be even more exclusive and asymmetrical
than those they promise to displace. Widespread access to digital media would go
hand-in-hand with what might be described as the emergence of a “big data”
divide – one that could not be ameliorated by any relatively simple technological
fix (such as more widespread broadband access) or by enhanced forms of education and training. In this respect, the
knowledge practices associated with big data represent a profoundly undemocratic shift insofar as they are reliant upon access to huge and costly
databases as well as to the processing power and technological know-how to
make use of the data.
Baudrillard proves – simulation functions as deterrence – total information awareness promises control by modeling every possible future in advance
Andrejevic 13 – Honorary Research Associate Professor, Centre for Critical and Cultural
Studies; media scholar who writes about surveillance, new media, and popular culture (Mark,
“InfoGlut,” Simulation as Deterrence) //RGP
The French cultural theorist Jean Baudrillard famously defined
simulation as a form of
deterrence, taking as his model the Cold War logic of “mutually assured destruction” (MAD). The deterrent
effect of simulation has been a recurring theme in popular science fiction that
received perhaps its most iconic pop-culture treatment in the movie War Games, which portrays a computer game that
goes awry, accessing the United States missile defense system and transforming a game of simulated nuclear war into the
real thing. Disaster
is averted when the program considers all possible outcomes of
the “game” of global thermonuclear war and discovers that, as in tic-tac-toe, if
both sides play rationally, attempting to win, there can be no winner. The computer,
which is programmed to learn, describes its assessment of “global thermonuclear war” in the movie’s finale: “A strange
game! The
only winning move is not to play.” Or, more accurately, the only right
way to play is by not playing: the game is already being played, as it were, prior to
any missile attack. The logic of mutual assured destruction relies on the ability to avoid a possible future by
modeling it. Simulation stands in for a kind of knowledge about the future that exerts
control in the present: “What stirs in the shadow of this posture under the pretext of a maximal ‘objective’
menace, and thanks to that nuclear sword of Damocles, is the perfection of the best system of control which has never
existed.”6 Simulation
as deterrence, then, operates in a paradoxically counterfactual
realm: that of the proven negative. On the one hand is the promise of information
as control that stipulates a kind of mechanistic causality, on the other is the claim
to intervene in the mechanism of causality itself. This is why, taken to their limits,
strategies of simulation invoke both total control and its eclipse: a kind of
smothering stasis in which all possibilities are fully saturated – everything has
been modeled in advance, including the modeling process itself. As Baudrillard puts it in
his discussion of the virtualization of reality via simulation, “What is the idea of the Virtual? It seems that it would be the
radical effectuation, the unconditional realization of the world, the transformation of all our acts, of all historical events, of
all material substance and energy into pure information. The
ideal would be the resolution of the
world by the actualization of all facts and data.”7 The apparent obstacle to such a
resolution is the limit of human perceptions, analytic ability, and time. The ability to
overcome such limits is relegated to the realm of the superhuman. As Laplace, the pioneer of mathematical probability,
put it, “Given for one instant an intelligence which could comprehend all the forces by which nature is animated and the
respective situation of the beings who compose it – an intelligence sufficiently vast to submit these data to analysis … for
it, nothing
would be uncertain and the future, as the past, would be present to its
eyes.”8 The name for that intelligence, viewed through one historical lens, would
be God. In the digital era, it is the computer and the database. In the era of predictive
analytics, popular fiction continues to experiment with the paradoxes of simulation. Consider, for example, the television
show Person of Interest, in which a renegade computer programmer taps into the government’s data-mining apparatus
(which he created) in order to predict when and where life-threatening crimes will occur. The show’s premise is that
automated surveillance has become both ubiquitous and multifaceted – all spaces and practices are monitored via
technologies ranging from smart cameras equipped with facial recognition technology to telephones with embedded voice-stress analyzers. The seemingly distributed network of commercial, public, and personal sensors and communication
devices (closed-circuit surveillance cameras, Webcams, smart phones, and so on) has been covertly colonized by a
centralized monitoring apparatus. This apparatus – which becomes increasingly “subjectivized” over the course of the
series – can watch, listen, and communicate with the main cast members through the full range of networked devices. It is
as if all of our various smart devices have teamed up to create an emergent machine intelligence. The show’s opening
sequence represents the monitoring process at work by portraying the view from the perspective of the all-seeing
surveillance apparatus. We see quick intercut shots of people viewed in grainy surveillance video overlaid with terms
meant to suggest the various forms of monitoring at work: “voice capture stress percentage”; “GPS (global positioning
system): active, tracking location”; “searching: all known databases”; etc. In this world, the
environment itself
has been redoubled as both setting and spectator. No one in particular is
watching, but everyone is watched all the time. The result is what Bogard describes as “the
impersonal domination of the hypersurveillance assemblage.”9 On the show, this assemblage comes to serve as a
technologized version of the mutant, prescient “pre-cogs” in Steven Spielberg’s 2002 movie Minority
Report,
based on the Philip K. Dick story that envisions a world in which crime is prevented before it takes place.
Dick’s story stages the paradox of simulated deterrence in a discussion between two
officials engaged in fighting “pre-crime”: “You’ve probably grasped the basic legalistic drawback to
precrime methodology. We’re taking in individuals who have broken no law … So the commission of the crime itself is
absolute metaphysics. We claim they’re culpable. They, on the other hand, eternally claim they’re innocent. And, in a
sense, they are innocent.”10 The
difference between the modeling of possible futures
proposed by Minority Report and the strategies of simulated deterrence currently
under development in the United States and elsewhere is that between
determinism and probability. The fictional portrayals envision a contradictory
world in which individual actions can be predicted with certainty and effectively
thwarted. They weave oracular fantasies about perfect foresight. Predictive analytics, by contrast, posits
a world in which probabilities can be measured and resources allocated
accordingly. Because forecasts are probabilistic, they never attain the type of certitude that would, for example,
justify arresting someone for a crime he or she has not yet committed. Rather, they distribute probabilities across
populations and scenarios. The
mobilization of such forms of data mining is anticipated
in Michel Foucault’s description of the rise of apparatuses of security, governed by
questions such as “How can we predict statistically the number of thefts at a given moment, in a given society, in a given
town, in the town or in the country, in a given social stratum, and so on? Second, are there times, regions, and penal
systems that will increase or reduce this average rate? Will crises, famines, or wars, severe or mild punishment, modify
something in these proportions? … What is the cost of suppressing these thefts … What therefore is the comparative cost
of theft and of its repression …?”11 What
emerges is a kind of actuarial model of crime: one
that lends itself to aggregate considerations regarding how best to allocate
resources under conditions of scarcity – a set of concerns that fits neatly with the
conjunction of generalized threat and the constriction of public-sector funding. The algorithm promises
not simply to capitalize on new information technology and the data it generates, but simultaneously to address
reductions in public resources. The
challenges posed by reduced manpower can be
countered (allegedly) by more information. As in other realms, enhanced information
processing promises to make the business of policing and security more efficient
and effective. However, it does so according to new surveillance imperatives,
including the guidance of targeted surveillance by comprehensive monitoring, the
privileging of prediction over explanation (or causality), and new forms of informational asymmetry. The data-driven promise of prediction, in other words, relies upon significant shifts in cultures and practices of information collection.
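The actuarial calculus in the Foucault passage, weighing the comparative cost of theft against the cost of its repression, can be made concrete in a few lines. The following is a minimal sketch, not taken from the source; every district name, rate, and cost is invented for illustration:

```python
# Illustrative sketch: the "actuarial" logic described above reduces to
# comparing expected losses against the cost of suppression, then allocating
# scarce resources where the expected payoff is highest.

districts = {
    # name: (predicted thefts per month, average loss per theft)
    "north": (40, 250.0),
    "harbor": (10, 900.0),
    "center": (25, 300.0),
}
PATROL_COST = 4000.0   # assumed monthly cost of assigning a patrol
PREVENTION_RATE = 0.5  # assumed fraction of thefts a patrol deters

def expected_net_benefit(thefts, loss):
    """Expected loss prevented by a patrol, minus the patrol's cost."""
    return thefts * loss * PREVENTION_RATE - PATROL_COST

# rank districts by expected payoff and fund only as many as the budget allows
ranked = sorted(
    districts.items(),
    key=lambda kv: expected_net_benefit(*kv[1]),
    reverse=True,
)
budget_patrols = 2  # scarcity: fewer patrols than districts
for name, (thefts, loss) in ranked[:budget_patrols]:
    print(f"patrol {name}: net benefit {expected_net_benefit(thefts, loss):.0f}")
```

Note how the sketch never asks *why* thefts occur in a district; prediction plus cost comparison is enough to drive the allocation, which is exactly the displacement of explanation by probability the passage describes.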
Authenticity
The glut of information that is fed to us by the surveillance state
makes authenticity impossible – instead we wait to be told what to
desire
Horning 14 (Rob, Executive Editor of The New Inquiry and author of Marginal Utility, citing
Mark Andrejevic, author of Infoglut, 7/10, “No Life Stories,”
http://thenewinquiry.com/essays/no-life-stories//Tang)
Purveyors of targeted marketing often try to pass off these sorts of intrusion and filtering as a kind of manufactured serendipity. Andrejevic cites a series of examples of marketing hype inviting us to imagine a world in which retailers know what consumers want before the consumers do, as though this were a long-yearned-for miracle of convenience rather than a creepy effort to circumvent even the limited autonomy of shopping sovereignty. “In the world of database-driven targeting,” Andrejevic argues, “the goal is, in a sense, to pre-empt consumer desire.” This is a strange goal, given that desire is the means by which we know ourselves. In hoping to anticipate our desires, advertisers and the platforms that serve ads work to dismantle our sense of self as something we must actively construct and make desire something we experience passively, as a fait accompli rather than a potentially unmanageable spur to action. Instead of constructing a self through desire, we experience an overload of information about ourselves and our world, which makes fashioning a coherent self seem impossible without help. If Big Data’s dismantling the intrinsic-self myth helped people conclude that authenticity was always an impossibility, a chimera invented to sustain the fantasy that we could consume our way to an ersatz uniqueness, that would be one thing. But instead, Big Data and social media foreground the mediated, incomplete self not to destroy the notion of the true self altogether but to open us to more desperate attempts to find our authentic selves. We are enticed into experiencing our “self” as a product we can consume, one that surveillance can supply us with. The more that is known about us, the more our attention can be compelled and overwhelmed, which in turn leads to a deeper reliance on the automatic filters and algorithms, a further willingness to let more information be passively collected about us to help us cope with it all. But instead of leading to resolution, a final discovery of the “authentic” self, this merely accelerates the cycle of further targeted stimulation. The ostensible goal of anticipating consumer desire and sating it in real time only serves the purpose of allowing consumers to want something else faster. So as surveillance becomes more and more total, Andrejevic argues, we experience our increasingly specified and information-rich place in this matrix as confusion, a loss of clarity or truth about the world and ourselves. Because excess information is “pushed” at us rather than something we have to seek out, we are always being reminded that there is more to know than we can assimilate, and that what we know is a partial representation, a construct. Like a despairing dissertation writer, we cannot help but know that we can’t assimilate all the knowledge it’s possible to collect. Each new piece of information raises further questions, or invites more research to properly contextualize it. Ubiquitous surveillance thus makes information overload everyone’s problem. To solve it, more surveillance and increasingly automated techniques for organizing the data it collects are authorized. In a series of chapters on predictive analytics, prediction markets, and body-language analysis and neuromarketing, Andrejevic examines the variety of emerging technology-driven methods meant to allow data to “speak for itself.” By filtering data through algorithms, brain scans, or markets, an allegedly unmediated truth contained within it can be unveiled, and we can bypass the slipperiness of discursive representation and slide directly into the real. Understanding why outcomes occur becomes unnecessary, as long as the probabilities of the correlations hold to make accurate predictions.
Deliberation/identity
Big Data kills democratic deliberation and social identity
Horning 14 (Rob, Executive Editor of The New Inquiry and author of Marginal Utility, citing
Mark Andrejevic, author of Infoglut, 7/10, “No Life Stories,”
http://thenewinquiry.com/essays/no-life-stories//Tang)
Far from being neutral or objective, data can be stockpiled as a political weapon
that can be selectively deployed to eradicate citizens’ ability to participate in
deliberative politics. Many researchers have pointed out that “raw data” is an oxymoron, if not a
mystification of the power invested in those who collect it. Subjective choices
must continually be made about what data is collected and how, and about any
interpretive framework to deploy to trace connections amid the information. As
sociologists Kate Crawford and danah boyd point out, Big Data “is the kind of data that encourages the practice of
apophenia: seeing patterns where none actually exist, simply because massive
quantities of data can offer connections that radiate in all directions.” The kinds
of “truths” Big Data can unveil depends greatly on what those with database
access choose to look for. As Andrejevic notes, this access is deeply asymmetrical, undoing
any democratizing tendency inherent in the broader access to information in
general. In his 2007 book iSpy: Surveillance and Power in the Interactive Era, he argues that “asymmetrical
monitoring allows for a managerial rather than democratic relationship to
constituents.” Surveillance makes the practice of “making one’s voice heard”
basically redundant and destroys its link to any intention to engage in
deliberative politics. Instead politics operates at the aggregate level, conducted by
institutions with the best access to the databases. These data sets will be opened to elite researchers and
the big universities that can afford to pay for access, Crawford and boyd point out, but everyone else will be mostly
left on the sidelines, unable to produce “real” knowledge. As a result, institutions with privileged
access to databases will have the ability to determine what is true. This plays out not only with
events but also with respect to the self. Just as politics necessarily requires interminable intercourse
with other people who don’t automatically see things our way and who at best
acknowledge alternate points of view only after protracted and often painful efforts to spell them out, so
does the social self. It is not something we declare for ourselves by fiat. I need to negotiate who I am
with others for the idea to even matter. Alone, I am no one, no matter how much
information I may consume. In response to this potentially uncomfortable truth,
we may turn to the same Big Data tools in search of a simpler and more directly
accessible “true self,” just as politicians and companies have done. Identity then becomes a
probability, even to ourselves. It ceases to be something we learn to instantiate
through interpersonal interactions but becomes something simply revealed when
sufficient data exists to simulate our future personality algorithmically. One is left to act
without any particular conviction while awaiting report from various recommendation engines on who we really are. In this sense, Big
Data incites what Andrejevic, following Žižek, calls “interpassivity,” in which our belief in
the ideology that governs us is automated, displaced onto a “big other” that does
the believing for us and alleviates us of responsibility for our complicity.
Surrendering the self to data processors and online services makes it a product to
be enjoyed rather than a consciousness to be inhabited. The work of selfhood is
difficult, dialectical, requiring not only continual self-criticism but also an awareness of the degree to which those around us shape us in ways we can’t control. We must engage them, wrestle with one another for our identities, be willing to make the painful surrender of our favorite ideas about ourselves and be vulnerable enough to becoming some of what others see more clearly about us. The danger is that we will settle for the convenience of technological work-arounds and abnegate the duty to debate the nature of the world we want to live in together. Instead of the collective work of building the social, we can settle for an automatically generated Timeline and algorithmically generated prompts for what to add to it. Data analysts can detect a correlation between two seemingly random points—intelligence and eating curly fries, say, as in a 2012 PNAS research paper by Michal Kosinski, David Stillwell, and Thore Graepel that made the rounds on Tumblr and Twitter in January—and potentially kick off a wave of otherwise inexplicable behavior. “I don’t know why I am eating curly fries all of a sudden, but that shows how smart I am!” Advertisers won’t need a plausible logic to persuade us to be insecure; they can let spurious data correlations speak for them with the authority of science. Unlike the Facebook mood-manipulation paper, the curly-fries paper enjoyed a miniviral moment in which it was eagerly reblogged for its novelty value, with only a mild skepticism, if any, attached. This suggests the seductive entertainment appeal these inexplicable correlations can provide—they tap the emotional climate of boredom to spread an otherwise inane finding that can then reshape behavior at the popular level. We’re much more likely to laugh about the curly fries paper and pass it on than to absorb any health organization’s didactic nutrition information. Our eagerness to share the news about curly fries corresponds with our willingness to accept it as true without being able to understand why. Its WTF incomprehensibility enhances its reach and thus its eventual predictive power. Likewise, the whimsical reblogging of the results from patently ridiculous online tests hints at how we may opt in to more “entertaining” solutions to the problem of self. If coherent self-presentation that considers the need of others takes work and a willingness to face our own shortcomings, collaborating with social surveillance and dumping personal experience into any and all of the available commercial containers is comparatively easy and fun. It returns to us an “objective” self that is empirically defensible, as well as an exciting and novel object for us to consume as entertainment. We are happily the audience and not the author of our life story. Thus the algorithm becomes responsible for our political impotence, an alibi for it that lets us enjoy its dubious fruits. By trading narratives for Big Data, emotions are left with no basis in any belief system. You won’t need a reason to feel anything, and feeling can’t serve as a reliable guide to action. Instead we will experience the fluctuation of feeling passively, a spectator to the spectacle of our own emotional life, which is now contained in an elaborate spreadsheet and updated as the data changes. You can’t know yourself through introspection or social engagement, but only by finding technological mirrors, whose reflection is systematically distorted in real time by their administrators. Let’s hope we don’t like what we see.
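The "curly fries" finding above is a textbook multiple-comparisons artifact. A toy sketch, using only the standard library and invented variable names, shows how scanning enough unrelated columns manufactures a strong-looking correlation out of pure noise — the apophenia Crawford and boyd describe:

```python
# Illustrative sketch (not from the source): with enough unrelated variables,
# some will correlate with any target purely by chance; the more columns
# scanned, the stronger the best spurious correlation looks.
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

target = [random.gauss(0, 1) for _ in range(30)]   # e.g. an "intelligence" score
features = {f"habit_{i}": [random.gauss(0, 1) for _ in range(30)]
            for i in range(1000)}                  # 1,000 random "habits"

# pick the habit that happens to correlate best with the target
best = max(features, key=lambda k: abs(pearson(features[k], target)))
print(best, round(pearson(features[best], target), 2))
```

Every column here is independent noise, yet the best of 1,000 candidates typically shows a correlation a marketer could dress up as a finding. The "truth" unveiled depends entirely on how many places the analyst with database access chose to look.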
Truth testing
Information overload makes testing truth claims impossible – it’s
used as a tool to protect power
McVey 13 (Alex, University of North Carolina at Chapel Hill, Communication Studies and
Cultural Studies, Graduate Student, Book Review: Infoglut,
http://library.queensu.ca/ojs/index.php/surveillance-and-society/article/viewFile/andrejevic/infoglut_review//Tang)
Mark Andrejevic’s Infoglut offers a theoretically rich account of the modern information landscape, examining how the massive proliferation of information transfer and storage through modern technology impacts our understanding of both communication and critique. The hyperproliferation of information in the era of the internet and computer data storage has contributed to a form of information overload called ‘Infoglut.’ This state of information overload marks what Slavoj Zizek calls the ‘decline of symbolic efficiency,’ in which the proliferation and accumulation of competing narratives and truth claims ultimately calls all claims to truth into question. Whereas power once operated through the establishment of a dominant narrative and the suppression of alternative narratives, the perpetual availability of competing claims to truth now makes old strategies of controlling information irrelevant. Where the task of the powerful was once to prevent new information from circulating that could hurt their interests, the task of the powerful is now to circulate so much information that any claim to truth can ultimately be called into question by mobilizing enough data. Controlling information no longer requires preventing new information from circulating but rather controlling access to databases and infrastructure capable of storing, monitoring, and analyzing massive quantities of data. Critical practices of the pre-infoglut era, rooted in theories of representation, are now reduced to conspiracy theories that dispel all claims to expertise as a form of hidden ideology, all while positing a new ideological claim under the postideological guise of prediction or affective certainty. Andrejevic sets out to examine how our modern condition of infoglut both changes the relationship between power and knowledge and calls for a re-examination of the role of critique in a time of postnarrative uncertainty. A variety of case studies demonstrate that, in response to the decline of symbolic efficiency, practices of analyzing data emerge that displace communication and deliberation by appealing either to the supposedly value-free calculations of an algorithm or the market place or to pre-cognitive affect in body language and neuroscience. Andrejevic looks at the way information is gathered, stored, analyzed, automated and deployed to shine light on the
varied landscapes of modern information processing, including the way inequalities manifest in the architectures of data storage and
analysis. Andrejevic offers a critique of the democratizing spirit of the internet as a means for the mass distribution of information, arguing that consumers now take part in ubiquitous acts of data sharing in which their behaviors are monitored, analyzed and used both to predict future behavior and to help shape future behavior. Looking at the phenomenon of ‘digital convergence,’ the process by which formerly distinct bodies of digital information converge, Andrejevic points out how ubiquitous information storage is becoming across the social
field. By looking at the convergence of data analysis in marketing, politics, surveillance, security, policing and popular culture, Andrejevic
mounts a convincing argument for information management as an all-encompassing and quotidian element of modern power. However,
Andrejevic is quick to warn readers against reading the decline of symbolic efficiency as a totalizing or all-encompassing fact of modern
existence. Far from the perfect crime predicting capacities foretold by the movie Minority Report, Andrejevic reminds readers that there are imperfections and probabilities distributed across these systems. Not all narratives can be disturbed through the proliferation of counter-information, and the decline of symbolic efficiency is far from complete. Yet despite this caveat, it is possible to detect some totalizing tendencies in Andrejevic’s account of information overload. For example, in his discussion of data mining,
Andrejevic argues that modern forms of information gathering have displaced old forms
of targeting and surveillance because whereas in old regimes the target had to
first be identified in order to be surveyed, in the modern era of infoglut, data
gathering and analysis of every possible variable allows data experts to predict
who potential suspects may be. As a result, Andrejevic argues, ‘emerging surveillance
strategies will continue to push for data access at the level of entire populations as
opposed to, say, that of suspicious (or, from a marketing perspective, desirable) groups or individuals’ (36). While Andrejevic is right to
argue that surveillance practices have taken on new forms in an era of modern infoglut, it would be a mistake to overemphasize these new
forms as a break from previous surveillance strategies. Doing so would miss the way in which surveillance still works to disproportionately target differently racialized, gendered and abled bodies. While it is
true that more and more members of the population increasingly participate in data transfers that are stored, monitored and analyzed by
data experts, racial, gender, class and other values can nevertheless be inscribed into the information processing systems Andrejevic
describes. Future scholars should build on Andrejevic’s work by pointing to how these axes of difference influence modern conditions of
infoglut. Andrejevic’s book should likewise be required reading for those attempting to engage in leftist politics in a digital age. Andrejevic argues that the ‘postmodern right’ has coopted the process of critique in the name of conspiracy theory and encouraged the proliferation of multiple truths in order to swamp any attempt to make a claim to a truth against the right. Examining, for example, the strategies of the Bush administration in Iraq, Andrejevic argues that the postmodern right relies on what Žižek calls the ‘Borrowed Kettle’ strategy in order to dispel criticism. In the face of critiques of Bush’s handling of the war in Iraq, the administration offers multiple contradictory accounts in order to preclude the possibility of locating one as true and thus being able to pin down the administration’s failures.
Additionally, Andrejevic looks at Glen Beck’s mobilization of conspiracy theories rooted in affective claims to understanding reality that dispel the knowledge of so-called ‘experts.’ Far from challenging the logic of the postmodern right, strategies that focus on critiquing their narratives and offering counteracting truth claims merely feed into the information economy that sustains the postmodern right’s existence. Additionally, Andrejevic shows how the convergence of data works as a conduit of both capitalism and the state, as a method of studying populations to predict and intervene on human behavior. Data mining gathers and stores massive amounts of consumer data, which in turn gets stored in databases that can be used for police and security measures. Prediction markets are used to bypass deliberative processes by depicting the market as a neutral arbiter capable of rising above the clutter of information overload, cementing antidemocratic, capitalist practices while laboring under the banner of a market-based populism. These interlocking inequities and power dynamics are all at play in Infoglut. Andrejevic similarly offers a fruitful interrogation of the role affect plays in negotiating the decline of symbolic efficiency. If the proliferation of competing claims to truth in an age of infoglut makes representation unreliable, affect offers a way of cutting through the fog of data by reading and analyzing the body’s pre-cognitive processes of decision making. Andrejevic examines how new social media technologies prompt new ways of attempting to analyze and interpret emotion. Here, affective economies encourage the monitoring of social media both in order to predict and intervene to help shape popular sentiments. Chapter four studies how the logic of the market comes to function as a sort of ‘affective fact’ that continues to function even in the face of the failure of the markets which dispel its main narrative. Andrejevic similarly locates the study of
affect in popular culture representations of the reading of body language, such as the TV shows Lie to Me or the World Series of Poker. In all
of these case studies,
affect offers a means of bypassing the unreliable plane of discourse
and representations to give way to a prediscursive understanding of the subject’s
desires, feelings, and behaviors. While much of Andrejevic’s work on affect describes how affect works in a historical
context of infoglut, Andrejevic also offers a bold theoretical move by critiquing affect
theory, with its focus on pre-cognitive intensities rather than rationality and
reason, as implicated in the strategies of neuroscience and neuromarketing used
by the powerful in response to the decline of symbolic efficiency. While this theoretical claim
is perhaps one of the more interesting in the book, Andrejevic seems rushed to lump the entire theoretical trajectory of affect theory
together with the practices of neuromarketing. While both affect theory and neuroscience share an interest in the precognitive elements of
human behavior, it remains to be seen how the particularities of affect theory either permit or challenge the practices of neuromarketing.
While Andrejevic ultimately points toward a more nuanced role for affect theory by arguing for a mode of affect that is neither incompatible
nor identical with reason, more scholarly work should be devoted to studying the links between affect theory and forms of capitalist
advertising and marketing practices that work at the level of affect. Infoglut is an important read for scholars interested in big data, affect
theory, psychoanalysis, Surveillance Studies, and the relationship between data and communication studies. This book is a must read for
communication scholars because it interrogates the ways that strategies of managing information overload attempt to bypass discourse,
representation and deliberation. In the face of an infoglut which thwarts traditional modes of criticism, Andrejevic calls on scholars to ‘gain
control over the forms of postcomprehension knowledge that promise to populate the databases and contest their displacement of
comprehension, models, theories, and narratives’ (164). Infoglut begins this project and marks an outstanding move in that direction.
NEG
A2: overload link turn
Alt Causes
Alt causes – data reduction isn’t enough to solve
Zoldan, 13
Ari Zoldan is an entrepreneur in the technology industry and business analyst based primarily in
New York City and Washington, D.C. “More Data, More Problems: is Big Data Always Right?”
Wired, May 2013, http://www.wired.com/2013/05/more-data-more-problems-is-big-dataalways-right/ // IS
How do we fight the problems of big data? First, we need to approach every data
set with skepticism. You have to assume that the data has inherent flaws, and that
just because something seems statistically right, doesn’t mean it is. Second, you
need to realize that data is a tool, not a course of action. Would you ask your hammer
how to build a house? Of course not! You can’t let the data do the thinking for you, and
can never sacrifice common sense. And third, having a lot of data is good, but what we
need are the means to analyze and interpret it for use.
Inev
Overload is inevitable and quick-fix solutions fail
Bawden & Robinson, Department of Information Science City
University London, ’08 (David Bawden; Lyn Robinson, Department of Information
Science, City University London, “The dark side of information: overload, anxiety and other paradoxes and pathologies” 9/19/2008
http://www.bollettinoadapt.it/old/files/document/21976david_b-2008.pdf) //GY
While it is true to say that overload has been recognised most clearly in the business and commercial sectors, and in specialist areas such as science and healthcare, it has been a matter of concern to information specialists in all environments, including academic and public libraries. It may be argued that information overload is the natural and inevitable condition of the human species. There has been a consistent viewpoint suggesting that the issue is exaggerated, or even imagined: see, for example, Savolainen [23]. Our senses, particularly the visual sense, are able to handle a huge amount of input, and to identify significant patterns within it. The modern information environment, however, presents us with information in forms with which our senses, and prior experiences, are ill-equipped to deal. The causes of overload, in this sense, are multiple and complex; hence the difficulty in providing any single “quick fix” solution. It is tempting, and usual, to assume that a major contributing factor, if not the only significant factor, in information overload is the TMI effect: “too much information”. This is readily supported by statistics of the sort often quoted [17]:
• a weekly edition of the New York Times contains more information than the average person was likely to come across in a lifetime in seventeenth-century England
• the English language of the late 20th century contains about 50,000 words, five times more than in Shakespeare’s lifetime
• the collections of the large US research libraries doubled between 1876 and 1990
• over one thousand books were published each day across the world during 1990
• more information has been created in the past 30 years than in the previous 5,000 years
• the number of records in publicly available online databases increased from 52 million in 1975 to 6.3 thousand million in 1994
• the number of documents on the Internet doubled from 400 million to 800 million from 1998 to 2000
• it would take over 200,000 years to ‘read all the Internet’, allowing 30 minutes per document.
Increasing diversity of information can also lead to overload, partly by virtue of a consequent increase in the volume of information on a given topic, which may come from varying perspectives, but also because of an intellectual difficulty in fitting it within a cognitive framework appropriate for the use and the user. Diversity may occur both in the nature of the information itself, and in the format in which it appears, with a typical business user having to deal with paper, e-mail, voicemail, traditional websites, and so on, to which the newer blogs, wikis and the like must be added. New information and communication technologies, aimed at providing rapid and convenient access to information, are themselves responsible for a high proportion of the overload effect: see, for example, Allen and Shoard [24]. Certain kinds of technology are generally highlighted in this respect, particularly “push” systems, which actively deliver information to the user without any request for it. While the volume of information available for search and retrieve at the user’s discretion—“pull”—may be so large as to be daunting, there is not the same sense of information constantly arriving without being under the user’s control as with the active delivery systems. E-mail is usually regarded as the worst offender, particularly with overuse of “blanket” e-mail or needless “cc-ing” of messages.
More data good
Bitcoin avoids PRISM compliance—the plan prevents federal data
collection on it through domestic companies
Neagle, 13
(Colin, 6-12-13, Network World, “Bitcoin isn't PRISM-proof”,
http://www.networkworld.com/article/2167213/software/bitcoin-isn-t-prism-proof.html, amp)
In the aftermath of the revelation of PRISM, the NSA spying program that collects user data
from nine major U.S. tech companies, many have highlighted alternate options from
organizations that are not known to be cooperating with government surveillance
efforts.
Among those alternatives, Bitcoin has been pegged as a more private payment option.
At Prism-Break.org, which lists alternatives to all the services that fall under the PRISM umbrella,
Bitcoin is the only listed alternative to online payment services, such as PayPal and Google Wallet.
But users should know that Bitcoin is not as anonymous as it seems, and while there is no
evidence that Bitcoin services are collaborating with federal agencies, information
on Bitcoin transactions is readily available to them on the Internet.
A 2011 study conducted by University College Dublin researchers Fergal Reid and Martin
Harrigan concluded that although anonymity has been one of Bitcoin’s main selling points,
“Bitcoin is not inherently anonymous.”
“We have performed a passive analysis of anonymity in the Bitcoin system using publicly available
data and tools from network analysis,” the researchers wrote in a blog post. “The results show that
the actions of many users are far from anonymous. We note that several centralized services, e.g.
exchanges, mixers and wallet services, have access to even more information should they wish to
piece together users' activity. We also point out that an active analysis, using say marked Bitcoins
and collaborating users, could reveal even more details.”
In 2012, the publicly available data on Bitcoin transactions was used by researchers Adi Shamir
and Dorit Ron to identify the first ever transaction on the network, which is believed to be from
an account held by Bitcoin’s mysterious creator, known only as Satoshi Nakamoto. While these
transactions were covered up quite well, Ron and Shamir concluded that they are not entirely
untraceable.
“Finally, we noted that the subgraph which contains these large transactions along with their
neighborhood has many strange looking structures which could be an attempt to conceal the
existence and relationship between these transactions, but such an attempt can be foiled by
following the money trail in a sufficiently persistent way,” the report explains.
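The "passive analysis" the researchers describe rests on heuristics like common input ownership: addresses spent together as inputs of one transaction are presumed to share an owner, so public transaction data alone clusters pseudonyms into identities. A minimal union-find sketch (not the Reid/Harrigan code itself; the addresses and transactions are invented toy data):

```python
# Cluster Bitcoin-style addresses by the common-input-ownership heuristic
# using union-find over publicly visible transactions.

parent = {}

def find(a):
    """Return the cluster representative of address a (path halving)."""
    parent.setdefault(a, a)
    while parent[a] != a:
        parent[a] = parent[parent[a]]
        a = parent[a]
    return a

def union(a, b):
    """Merge the clusters containing addresses a and b."""
    parent[find(a)] = find(b)

# each transaction: list of input addresses (all presumed same owner)
transactions = [
    ["addr1", "addr2"],   # addr1 and addr2 co-spent -> same owner
    ["addr2", "addr3"],   # links addr3 into the same cluster
    ["addr9"],            # unrelated single-input transaction
]
for inputs in transactions:
    for addr in inputs:
        find(addr)                 # register every address
    for addr in inputs[1:]:
        union(inputs[0], addr)     # co-spent inputs share an owner

cluster = {a for a in parent if find(a) == find("addr1")}
print(sorted(cluster))  # ['addr1', 'addr2', 'addr3']
```

Three pseudonymous addresses collapse into one identity from nothing but the public ledger, which is why one real-world link (an exchange account, a shipping address) can deanonymize an entire cluster at once.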
This may not come as a surprise to the most passionate members of the Bitcoin community, who
look at Bitcoin as a movement to revolutionize online payments, rather than a tool to remain
anonymous on the Internet. Zach Harvey, co-founder of Lamassu and co-creator of the Bitcoin
ATM, says Bitcoin is actually “horrible for money laundering” because the veil of anonymity can
be lifted.
Indeed, late last month the online currency exchange service Liberty Network, which is similar to
Bitcoin, was infiltrated by international law enforcement agencies that allege it laundered more
than $6 billion in money for criminal organizations. The investigation was brought down after an
undercover agent created an account on Liberty Network and listed the purpose as “cocaine.”
Basically, if independent researchers can trace Bitcoin transactions back to the people
responsible, and the U.S. government can investigate digital currencies hosted overseas (Liberty
Network was based in Costa Rica), then the NSA, CIA, FBI or any other federal agency can
likely peek into Bitcoin activity as well.
Extra-PRISM authorities are key to investigate the Dark Web
Green, 15
(Shemmyla, 4-19-15, Cyberbear Tracks, “Exploring the Deep Web”,
http://cyberbeartracks.com/?p=545, amp)
Darknet is a subsection of Deep Web that is accessed by Tor. Tor is a web browser, like
Chrome or Safari, and free software that helps you defend against traffic analysis, a form of
network surveillance that threatens personal freedom and privacy, confidential business activities
and relationships, and state security. It sends Internet data through a series of ‘relays’, adding
extra encryption, making web traffic practically impossible to trace. This is the place where much
of the anonymous dark, perverted, creepy and illegal activity is, but is it truly anonymous? If the
FBI truly seized the Silk Road’s servers illegally and, based off what has
been discovered about the NSA and PRISM, the answer is no.
“There is no such thing as really being anonymous on the Internet. If [hackers and government
agencies] want you, they will get you. At the moment the Tor network’s security has never been
broken, but there are flaws around it that can be exploited,” Andy Malone, of Microsoft Enterprise
Security and founder of the Cyber Crime Security Forum, said at the Microsoft TechEd North
America 2014.
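The relay-and-encryption scheme the card describes can be illustrated with a toy onion-layering sketch. This is only an analogy: XOR with a per-relay key stands in for real cryptography, and the relay names and keys are invented for the example.

```python
# Toy illustration of Tor-style onion layering: the sender wraps the message
# in one encryption layer per relay, and each relay peels exactly one layer.
# XOR is a stand-in for a real cipher; keys and relay roles are invented.
def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

relay_keys = [b"guard-key", b"middle-key", b"exit-key"]

def wrap(message: bytes) -> bytes:
    # Sender encrypts for the exit relay first, then middle, then guard,
    # so the guard's layer is outermost on the wire.
    for key in reversed(relay_keys):
        message = xor(message, key)
    return message

def route(cell: bytes) -> bytes:
    # Each relay strips its own layer; only after the last relay
    # does the plaintext emerge.
    for key in relay_keys:
        cell = xor(cell, key)
    return cell

assert route(wrap(b"hello")) == b"hello"
```

The point of the layering is that no single relay sees both who sent the traffic and what it says, which is why traffic analysis, rather than breaking the encryption itself, is the usual attack surface.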
Now let’s discuss PRISM and the NSA. PRISM is a tool used by the US National Security
Agency (NSA) to collect private electronic data belonging to users of major Internet services like
Gmail, Facebook, Outlook, and others. It’s the latest evolution of the US government’s post-9/11
electronic surveillance efforts, which began under President Bush with the Patriot Act, and
expanded to include the Foreign Intelligence Surveillance Act (FISA) amended in 2006 and 2007.
NSA programs collect two kinds of data: metadata and content. Metadata is the sensitive
byproduct of communications, such as phone records that reveal the participants, times, and
durations of calls; the communications collected by PRISM include the contents of emails, chats,
VoIP calls, cloud-stored files, and more. This method of catching criminals appears very intrusive
to the average law-abiding citizen and is a violation of our 4th Amendment rights. In order to
obtain a search warrant, law enforcement officers must:
1. Have probable cause to believe a search is justified,
2. Support this showing with sworn statements (affidavits), and
3. Describe with particularity the place they will search and the items they will seize.
Not the polar opposite method.
Bitcoin and the dark web cause lone wolf terrorism
Terence Check 13, J.D. Candidate, Cleveland-Marshall College of Law, 5/5/13, “Shadow
currency: how Bitcoin can finance terrorism,” http://theworldoutline.com/2013/05/shadow-currency-how-bitcoin-can-finance-terrorism/
This “crypto-currency” has already been the inspiration for several online robberies where cyber-thieves hack into a
computer to steal the vital electronic information at the heart of Bitcoins. Beyond cyber-larceny, the secrecy of Bitcoin
poses unique, and even frightening, security challenges for a world that has yet to fully understand the
problems posed by the internet age.
For example, consider the various national and international anti-money laundering statutes.
These laws seek to prevent the illegal flow of currency between criminals, terrorists and other
unsavory characters. But these laws require that there are actual shipments of cash between
countries and criminal networks (or at the very least funds transfers between banks).
The Bitcoin protocol promises to remove the fundamental risk in money laundering:
the risk of interception and detection. By using a monetary exchange like Mt.Gox, criminals can buy
Bitcoins at the market rate and then they can sell to a confederate across the world at a higher price, effectuating the
exchange of money. Even if Bitcoin performs poorly, it nevertheless provides an opportunity to exchange money via the
anonymous P2P network.
The Silk Road can make Bitcoin even more insidious. While the Silk
Road, as site policy, forbids the sale of destructive items (stolen credit cards, explosives, etc.), it
could be a matter of time before a similar website arises. Then, the firearms laws of
the Western world will become virtually useless. Guns can be disassembled, and their parts shipped
piecemeal through the postal service. Even substances like Tannerite could be bought and
shipped across the globe, providing new opportunities for destructive capacity. If
this alone is not enough to compel attention to the growing black market on
cyberspace, consider the following.
Bitcoin can make security and law enforcement measures less effective by simply
removing the possibility of detection. Terrorist cells or lone wolf operators can get
supplies and currency by using the anonymous underbelly of the internet.
Government agents are able to detect terrorists through logistical networks (Usama
bin Laden was found through his courier). Counter-terrorism, for better or worse, succeeds when it
has human networks to exploit. Terrorists need accomplices, handlers, recruits, and
suppliers. Sooner or later, one of the individuals in this vast network becomes frightened or disillusioned with the cause
and becomes a government informant. Remove the extended logistical network that exposes
terrorists to investigation at a critical juncture (where their plans are neither theoretical nor well-supplied
enough to implement) and there may be grievous results.
So what legal paths can be utilized to make sure such a development does not occur? The easiest and most
effective way to deal with this threat is to make sure that it never comes into
fruition. The Silk Road is difficult to take down given its place within the “Deep Internet”, but an arms-trading
counterpart may be more susceptible to infiltration and dismemberment.
The second option spells doom for electronic currencies. Much like domestic laws that flag large banking transactions,
governments and the private sector can collude to run Bitcoin out of the currency market.
Simply put, laws could be passed that force banks to reject bitcoin transactions. Thus, even if
Bitcoins continue to be traded, there is no way to turn them back into real
currency. The final approach would require nations to expand the police power of domestic and foreign intelligence
agencies on the web. While there is a visceral aversion to government personnel infiltrating internet communications, the
ultimate security benefits may outweigh the cost to certain freedoms.
Metadata is the crux of counterterrorism—key to hindsight and
prediction
Hines, 13
(Pierre, defense council member of the Truman National Security Project, 6-19-13, Quartz,
“Here’s how metadata on billions of phone calls predicts terrorist attacks”,
http://qz.com/95719/heres-how-metadata-on-billions-of-phone-calls-predicts-terrorist-attacks/,
amp)
Yesterday, when NSA Director General Keith Alexander testified before the House Committee
on Intelligence, he declared that the NSA’s surveillance programs have provided “critical
leads to help prevent over 50 potential terrorist events.” FBI Deputy Director Sean Boyce
elaborated by describing four instances when the NSA’s surveillance programs have had an
impact: (1) when an intercepted email from a terrorist in Pakistan led to foiling a
plot to bomb the New York subway system; (2) when NSA’s programs helped
prevent a plot to bomb the New York Stock Exchange; (3) when intelligence led to
the arrest of a U.S. citizen who planned to bomb the Danish Newspaper office that
published cartoon depictions of the Prophet Muhammad; and (4) when the NSA’s
programs triggered reopening the 9/11 investigation.
So what are the practical applications of internet and phone records gathered from two NSA
programs? And how can “metadata” actually prevent terrorist attacks?
Metadata does not give the NSA and intelligence community access to the content of
internet and phone communications. Instead, metadata is more like the transactional
information cell phone customers would normally see on their billing statements—metadata can
indicate when a call, email, or online chat began and how long the communication lasted. Section
215 of the Patriot Act provides the legal authority to obtain “business records” from phone
companies. Meanwhile, the NSA uses Section 702 of the Foreign Intelligence Surveillance
Act to authorize its PRISM program. According to the figures provided by Gen.
Alexander, intelligence gathered based on Section 702 authority contributed in over 90%
of the 50 cases.
One of the major benefits of metadata is that it provides hindsight—it gives intelligence
analysts a retrospective view of a sequence of events. As Deputy Director Boyce discussed, the
ability to analyze previous communications allowed the FBI to reopen the 9/11
investigation and determine who was linked to that attack. It is important to recognize that terrorist
attacks are not orchestrated overnight; they take months or years
to plan. Therefore, if the intelligence community only catches wind of an attack halfway
into the terrorists’ planning cycle, or even after a terrorist attack has taken place,
metadata might be the only source of information that captures the sequence of
events leading up to an attack. Once a terrorist suspect has been identified or once
an attack has taken place, intelligence analysts can use powerful software to
sift through metadata to determine which numbers, IP addresses, or
individuals are associated with the suspect. Moreover, phone numbers and IP
addresses sometimes serve as a proxy for the general location of where the planning has
taken place. This ability to narrow down the location of terrorists can help determine
whether the intelligence community is dealing with a domestic or international
threat.
Even more useful than hindsight is a crystal ball that gives the intelligence
community a look into the future. Simply knowing how many individuals are in a
chat room, how many individuals have contacted a particular phone user, or how many
individuals are on an email chain could serve as an indicator of how many terrorists
are involved in a plot. Furthermore, knowing when a suspect communicates can help
identify his patterns of behavior. For instance, metadata can help establish whether
a suspect communicates sporadically or on a set pattern (e.g., making a call every
Saturday at 2 p.m.). Any deviation from that pattern could indicate that the plan
changed at a certain point; any phone number or email address used consistently and then not
at all could indicate that a suspect has stopped communicating with an associate. Additionally, a
rapid increase in communication could indicate that an attack is about to happen.
Metadata can provide all of this information without ever exposing the content of a phone call or
email. If the metadata reveals the suspect is engaged in terrorist activities, then
obtaining a warrant would allow intelligence officials to actually monitor the content
of the suspect’s communication.
In Gen. Alexander’s words, “These programs have protected our country and allies . . . [t]hese
programs have been approved by the administration, Congress, and the courts.” Now, Americans
will have to decide whether they agree.
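The pattern-of-life analysis Hines describes (a set communication pattern, deviations from it, and a rapid increase in contact frequency) can be sketched with a toy call log. The timestamps and the one-hour burst threshold are invented for illustration; real metadata analytics are far more elaborate.

```python
from datetime import datetime

# Hypothetical call log for one suspect (all timestamps invented):
# a steady weekly pattern, then a sudden burst of calls.
calls = [
    "2013-06-01 14:00", "2013-06-08 14:00", "2013-06-15 14:00",  # every Saturday 2 p.m.
    "2013-06-20 03:11", "2013-06-20 03:15", "2013-06-20 03:20",  # rapid burst
]
times = [datetime.strptime(t, "%Y-%m-%d %H:%M") for t in calls]

def gaps_hours(ts):
    """Hours elapsed between each consecutive pair of calls."""
    return [(b - a).total_seconds() / 3600 for a, b in zip(ts, ts[1:])]

def flag_anomalies(ts, burst_threshold_hours=1.0):
    """Flag call indices where the inter-call gap collapses, i.e. a
    rapid increase in communication relative to the prior pattern."""
    return [i + 1 for i, g in enumerate(gaps_hours(ts)) if g < burst_threshold_hours]

print(flag_anomalies(times))  # → [4, 5]
```

Note that nothing here touches call content: timing alone exposes both the suspect's routine (the 168-hour weekly gaps) and the deviation that, on the card's argument, could signal an imminent attack.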
Mass collections solves terror – all data is important
Schulberg, Huffington Post, 5/10 (Jessica Schulberg, Huffington Post correspondent,
MA in international politics, “Richard Burr Says 9/11 Could Have Been Preventable With Mass
Surveillance,” 05/10/2015, http://www.huffingtonpost.com/2015/05/10/burr-patriot-act-911_n_7251814.html) //GY
Sen. Richard Burr (R-N.C.) said on Sunday that the Sept. 11, 2001, attacks may have been
preventable if the bulk phone collection program that exists today under the Patriot
Act was in effect back then.¶ Speaking on ABC’s “This Week,” Burr, who chairs the Senate
Intelligence Committee, rejected the idea of returning to a narrower surveillance
program that would only collect data on people suspected of being terrorists.
“That turns us back to pre-9/11,” said Burr. “It was very time consuming, it was
cumbersome.”¶ Explaining the decision to pass the Patriot Act, Burr said, “What we looked at
was the impact of 9/11 and the fact that we might have been able to stop
9/11, had we had bulk collection.”¶ Three sections of the Patriot Act, the law passed
immediately after the attacks, are set to expire June 1 (but May 22 is the last day Congress has to
act before going into recess). One key provision that is set to expire is Section 215, which has
served as the legal justification for the government’s phone records collection program.¶ "I do
think it should continue for the simple reason that it's very effective at keeping
America safe," Burr said Sunday. "And in addition to that, we've had absolutely no
incident of anybody's privacy being intruded on."¶ The already contentious debate
about whether to reauthorize the program has been further complicated by Thursday’s federal
appeals court ruling, which found that Congress did not authorize the phone collections program
in its current form when it passed the Patriot Act.¶ Sen. Ron Johnson (R-Wis.), chairman of the
Senate Homeland Security Committee, was quick to note that the court’s ruling did not
definitively rule out the legality of such a program.¶ "It's important to note that the Second Circuit
Court of Appeals did not rule it unconstitutional," he said Sunday on CNN’s “State of the Union."
"They just said it was not being applied properly based on the law that was written. So we need to
take a very careful look at the way we write these, quite honestly, very complex laws."¶ Johnson
criticized Edward Snowden's revelations about the program as "demagoguery" that has "done
great harm to our ability to gather information." He added, "Our best line of defense, trying
to keep this nation safe and secure, is an effective intelligence-gathering
capability, with robust congressional oversight."¶ Sen. Ron Wyden (D-Ore.) promised on
Sunday to filibuster a reauthorization of the Patriot Act unless it includes significant reforms.¶
Squo solves – Big Data
No NSA overload – Big Data solves
Rosenbach et al. 13 (Marcel Rosenbach, German journalist, Holger Stark, Professor at the
University of Göttingen, and Jonathan Stock, German journalist, 6/10, “Prism Exposed: Data
Surveillance with Global Implications”, http://www.spiegel.de/international/world/prism-leak-inside-the-controversial-us-data-surveillance-program-a-904761-2.html//Tang)
It is now clear that what experts suspected for years is in fact true -- that the NSA monitors every form of electronic
communication around the globe. This fact raises an important question: How can an intelligence agency, even
one as large and well-staffed as the NSA with its 40,000 employees, work meaningfully with such
a flood of information? The answer to this question is part of a phenomenon that is currently a major topic
for the business community as well and goes by the name "Big Data." Thanks to new database technologies, it
is now possible to connect entirely disparate forms of data and analyze them automatically. A rare
glimpse into what intelligence services can do by applying this "big data" approach came last year from David Petraeus.
This new form of data analysis is concerned with discovering "non-obvious relationships," the then
freshly minted CIA director explained at a conference. This includes, for example "finding connections
between a purchase here, a phone call there, a grainy video, customs and immigration
information." The goal, according to Petraeus, is for big data to "lead to automated discovery, rather than
depending on the right analyst asking the right question." Algorithms pick out connections
automatically from the unstructured sea of data they trawl. "The CIA and our intelligence community
partners must be able to swim in the ocean of 'Big Data.' Indeed, we must be world class swimmers -- the best, in fact," the
CIA director continued. The Surveillance State The value of big data analysis for US intelligence agencies
can be seen in the amount the NSA and CIA are investing in it. Not only does this include multimillion-dollar contracts with providers specializing in data mining services, but the CIA also invests directly, through its
subsidiary company In-Q-Tel, in several big data start-ups. It's about rendering people and their behavior
predictable. The NSA's research projects aim to forecast, on the basis of telephone data and Twitter
and Facebook posts, when uprisings, social protests and other events will occur . The agency is also
researching new methods of analysis for surveillance videos with the hopes of recognizing conspicuous behavior before an
attack is committed. Gus Hunt, the CIA's chief technology officer, made a forthright admission in March: "We
fundamentally try to collect everything and hang onto it forever." What he meant by "everything," Hunt also made clear:
"It is really very nearly within our grasp to be able to compute on all human-generated information," he said. That
statement is difficult to reconcile with the Fourth Amendment to the US Constitution, which guarantees the right to
privacy. This is probably why Hunt added, almost apologetically: "Technology in this world is moving faster than
government or law can keep up."
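The "non-obvious relationships" Petraeus describes amount to automatically joining disparate data sets on shared identifiers, without an analyst asking about any particular target first. A minimal sketch of that join, with entirely invented records:

```python
# Two disparate, invented data sets keyed on the same phone number.
purchases = [
    {"phone": "555-0100", "item": "prepaid SIM"},
    {"phone": "555-0199", "item": "groceries"},
]
call_records = [
    {"phone": "555-0100", "callee": "555-0177"},
]

def non_obvious_links(a, b, key):
    """Automated join: surface entities that appear in both data sets
    ('a purchase here, a phone call there') with no query in advance."""
    index = {row[key]: row for row in a}
    return [(index[row[key]], row) for row in b if row[key] in index]

for purchase, call in non_obvious_links(purchases, call_records, "phone"):
    print(purchase["phone"], purchase["item"], "->", call["callee"])
```

This is the "automated discovery" half of the card's argument: the algorithm, not the analyst, decides which connections surface, which is also why the approach scales with data volume rather than drowning in it.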
Squo solves overload – big data
Segal, 14
Mark E. Segal, Chief of the Computer and Information Sciences Research Group at the NSA,
“Guest Editor’s Column,” The Next Wave, 11/28/14,
https://www.nsa.gov/research/tnw/tnw204/article1.shtml // IS
As Big Data analytics become more ubiquitous, concerns naturally arise about
how data is collected, analyzed, and used. In particular, people whose data is stored in
vast data repositories, regardless of who owns the repositories, are worried about potential
privacy rights violations. Although privacy issues are not discussed in detail in this issue of TNW,
an excellent overview of the relevant issues may be found in a report titled "Big Data and privacy:
A technological perspective" authored by the President's Council of Advisors on Science and
Technology and delivered to President Obama in May 2014 [1]. Another useful resource on this
topic and other topics related to Big Data is the article "Big Data and its technical challenges" by
H. V. Jagadish et al. published in the July 2014 issue of Communications of the ACM [2].
According to a 2012 study by the International Data Corporation, there will be
approximately 10²² bytes of data stored in all of the computers on Earth by 2015
[3]. To put that number in perspective, that's more than the estimated 7.5 x 10¹⁸
grains of sand on all of the beaches of the Earth [4], and almost as much as the
estimated 10²² to 10²⁴ stars in the Universe [5, 6]. Let's harness the tools and
algorithms currently being used to process Big Data to solve some of our
planet's most critical problems. We hope you find this issue of TNW interesting,
informative, and thought-provoking.
Squo solves - CC
Squo solves overload – cloud computing
Burkhardt, 14
Paul Burkhardt, computer science researcher in the Research Directorate at NSA. He received his
PhD from the University of Illinois at Urbana-Champaign. “An overview of Big Data,” The Next
Wave, 11/28/14, https://www.nsa.gov/research/tnw/tnw204/article2.shtml // IS
The volume and velocity of Big Data is exceeding our rate of physical storage and
computing capacity, creating scalability demands that far outpace hardware
innovations. Just as multicore chips were designed in response to the limits of clock speeds
imposed by Moore’s Law, cloud technologies have surfaced to address the
impending tidal wave of information. The new cloud architectures
pioneered by Google and Amazon extended distributed computing from its roots
in high-performance computing and grid computing, where hardware was expensive
and purpose-built, to large clusters made from low-cost commodity computers,
ushering in the paradigm of “warehouse” computing. These new cloud data centers
containing thousands of computer cabinets are patrolled by administrators on
motorized carts to pull and replace failed components.
Squo Solves – gov checks
Squo solves overload – government checks
Gross 13
Grant Gross, citing a former civil liberties watchdog in the Obama White House, “Critics question
whether NSA data collection is effective,” PC World, 6/25/13,
http://www.pcworld.idg.com.au/article/465878/critics_question_whether_nsa_data_collection
_effective/ // IS
But Timothy Edgar, a former civil liberties watchdog in the Obama White House
and at the Office of Director of National Intelligence, partly defended the NSA
collection programs, noting that U.S. intelligence officials attribute the
surveillance programs with preventing more than 50 terrorist actions. Some
critics have disputed those assertions.
Edgar criticized President Barack Obama's administration for keeping the NSA programs secret.
He also said it was "ridiculous" for Obama to suggest that U.S. residents shouldn't be concerned
about privacy because the NSA is collecting phone metadata and not the content of phone calls.
Information about who people call and when they call is sensitive, he said.
But Edgar,
now a visiting fellow at the Watson Institute for International Studies
at Brown University, also said that Congress, the Foreign Intelligence
Surveillance Court and internal auditors provide some oversight of the data
collection programs, with more checks on data collection in place in the U.S. than
in many other countries. Analysts can query the phone records database only if
they see a connection to terrorism, he said.
Squo solves - investment
Squo solves overload – investment
Burkhardt, 14
Paul Burkhardt, computer science researcher in the Research Directorate at NSA. He received his
PhD from the University of Illinois at Urbana-Champaign. “An overview of Big Data,” The Next
Wave, 11/28/14, https://www.nsa.gov/research/tnw/tnw204/article2.shtml // IS
In March 2012, the White House announced the National Big Data Research and
Development Initiative [14] to help address challenges facing the government, in
response to the President’s Council of Advisors on Science and Technology, which concluded the
“Federal Government is under-investing in technologies related to Big Data.” With a budget of
over $200 million and support of six federal departments and agencies, this
initiative was created to:
Advance state-of-the-art core technologies needed to collect, store, preserve,
manage, analyze, and share huge quantities of data;
Harness these technologies to accelerate the pace of discovery in science and engineering,
strengthen our national security, and transform teaching and learning; and
Expand the workforce needed to develop and use Big Data technologies.
As part of the Big Data Initiative, the National Science Foundation (NSF) and the
National Institutes of Health are funding a joint Big Data solicitation to “advance
the core scientific and technological means of managing, analyzing, visualizing,
and extracting useful information from large and diverse data sets.” In addition,
the NSF is funding the $10 million Expeditions in Computing project led by
University of California at Berkeley, to turn data into knowledge and insight, and
funding a $2 million award for a research training group to support training for
students in techniques for analyzing and visualizing complex data.
The Department of Defense (DoD) is also investing $250 million annually to
“harness and utilize massive data in new ways” and another $60 million for new
research proposals. DARPA, the research arm of the DoD, will invest $25 million
annually under its XDATA program for techniques and tools to analyze large
volumes of data, including
Developing scalable algorithms for processing imperfect data in distributed data stores,
and
Creating effective human-computer interaction tools for facilitating rapidly customizable
visual reasoning for diverse missions.
The Department of Energy is similarly providing $25 million in funding to establish
the Scalable Data Management, Analysis and Visualization Institute to develop new tools for
managing and visualizing data.
Squo Solves - research
Squo solves overload – Research Directorate
NSA, 11
NSA, on a website describing its internal structure for its “premier unclassified event,”
“Government Host Descriptions,” 2011, http://www.ncsi.com/nsabiam11/host_descriptions.html
// IS
In the NSA/CSS Research Directorate, opportunities abound. We are committed
to providing the tools, the technology, and the techniques to ensure the success of
the Agency’s Signals Intelligence and Information Assurance missions now and in
the future. Our vital research program focuses on four critical goals: We develop
the means to dominate the global computing and communications network. We
cope with the overload of information in our environment and turn that
overload to our strategic advantage. We provide the means for ubiquitous, secure
collaboration both within our government and through its interactions with various partners. We
create the means for penetrating into the “hard” targets that threaten our nation wherever,
whenever, or whomever they may be.
Squo solves - tech
New technology means no overload
Rosenbach et al, ’13 (By Marcel Rosenbach, Holger Stark and Jonathan Stock – acclaimed
German political scientists and journalists “Prism Exposed: Data Surveillance with Global
Implications” Spiegel Online International, June 10, 2013
http://www.spiegel.de/international/world/prism-leak-inside-the-controversial-us-data-surveillance-program-a-904761-2.html) //GY
It is now clear that what experts suspected for years is in fact true -- that the NSA monitors every
form of electronic communication around the globe. This fact raises an important question: How
can an intelligence agency, even one as large and well-staffed as the NSA with its
40,000 employees, work meaningfully with such a flood of information?¶ The
answer to this question is part of a phenomenon that is currently a major topic for the business
community as well and goes by the name "Big Data." Thanks to new database technologies, it is
now possible to connect entirely disparate forms of data and analyze them
automatically.¶ A rare glimpse into what intelligence services can do by applying this "big data"
approach came last year from David Petraeus. This new form of data analysis is
concerned with discovering "non-obvious relationships," the then freshly minted CIA
director explained at a conference. This includes, for example "finding connections between a
purchase here, a phone call there, a grainy video, customs and immigration information." ¶ The
goal, according to Petraeus, is for big data to "lead to automated discovery, rather
than depending on the right analyst asking the right question." Algorithms pick out
connections automatically from the unstructured sea of data they trawl. "The CIA
and our intelligence community partners must be able to swim in the ocean of 'Big Data.' Indeed,
we must be world class swimmers -- the best, in fact," the CIA director continued. ¶ The
Surveillance State¶ The value of big data analysis for US intelligence agencies can be
seen in the amount the NSA and CIA are investing in it. Not only does this include
multimillion-dollar contracts with providers specializing in data mining services, but the CIA also
invests directly, through its subsidiary company In-Q-Tel, in several big data start-ups.¶ It's about
rendering people and their behavior predictable. The NSA's research projects aim to forecast, on
the basis of telephone data and Twitter and Facebook posts, when uprisings, social protests and
other events will occur. The agency is also researching new methods of analysis for
surveillance videos with the hopes of recognizing conspicuous behavior before an
attack is committed.¶ Gus Hunt, the CIA's chief technology officer, made a forthright
admission in March: "We fundamentally try to collect everything and hang onto it
forever." What he meant by "everything," Hunt also made clear: "It is really very
nearly within our grasp to be able to compute on all human-generated
information," he said.¶ That statement is difficult to reconcile with the Fourth Amendment to
the US Constitution, which guarantees the right to privacy. This is probably why Hunt added,
almost apologetically: "Technology in this world is moving faster than government or law can
keep up."
Squo solves overload – new tools
Kirby, 13
Bob Kirby, vice president of sales for CDW·G, a leading technology provider to government and
education. “Big Data Can Help the Federal Government Move Mountains. Here's How.,” FedTech
Magazine, 08/01/13, http://www.fedtechmagazine.com/article/2013/08/big-data-can-help-federal-government-move-mountains-heres-how // IS
The White House took a step toward helping agencies find these technologies
when it established the National Big Data Research and Development Initiative
in 2012. The initiative included more than $200 million to make the most of the
explosion of Big Data and the tools needed to analyze it.
The challenges that Big Data poses are nearly as daunting as its promise is encouraging. Storing
data efficiently is one of these challenges. As always, budgets are tight, so agencies must minimize
the per-megabyte price of storage and keep the data within easy access so that users can get it
when they want it and how they need it. Backing up massive quantities of data heightens the
challenge.
Analyzing the data effectively is another major challenge. Many agencies employ commercial tools
that enable them to sift through the mountains of data, spotting trends that can help them
operate more efficiently. (A recent study by MeriTalk found that federal IT executives think Big
Data could help agencies save more than $500 billion while also fulfilling mission objectives.)
Custom-developed Big Data tools also are allowing agencies to address the need
to analyze their data. For example, the Oak Ridge National Laboratory’s
Computational Data Analytics Group has made its Piranha data analytics system
available to other agencies. The system has helped medical researchers find a link that can
alert doctors to aortic aneurysms before they strike. It’s also used for more mundane
tasks, such as sifting through résumés to connect job candidates with hiring
managers.
A2: Chinese Cyberwar
No US-China cyber escalation – litany of checks
Lindsay, 15
Jon Lindsay, Assistant Professor of Digital Media and Global Affairs at the University of Toronto,
research scientist with the University of California Institute on Global Conflict and Cooperation,
assistant adjunct professor at the UC San Diego School of International Relations and Pacific
Studies, and an Oxford Martin Associate with the Oxford Global Cyber Security Capacity Centre,
“Exaggerating the Chinese Cyber Threat,” Belfer Center for Science and International Affairs,
Harvard Kennedy School, May 2015,
http://belfercenter.ksg.harvard.edu/publication/25321/exaggerating_the_chinese_cyber_threat.
html // IS
Policymakers in the United States often portray China as posing a serious
cybersecurity threat. In 2013 U.S. National Security Adviser Tom Donilon stated that Chinese
cyber intrusions not only endanger national security but also threaten U.S. firms with the loss of
competitive advantage. One U.S. member of Congress has asserted that China has "laced the U.S.
infrastructure with logic bombs." Chinese critics, meanwhile, denounce Western
allegations of Chinese espionage and decry National Security Agency (NSA)
activities revealed by Edward Snowden. The People's Daily newspaper has
described the United States as "a thief crying 'stop thief.'" Chinese commentators
increasingly call for the exclusion of U.S. internet firms from the Chinese market, citing concerns
about collusion with the NSA, and argue that the institutions of internet governance give the
United States an unfair advantage.
The rhetorical spiral of mistrust in the Sino-American relationship threatens to
undermine the mutual benefits of the information revolution. Fears about the
paralysis of the United States' digital infrastructure or the hemorrhage of its competitive
advantage are exaggerated. Chinese cyber operators face underappreciated
organizational challenges, including information overload and bureaucratic
compartmentalization, which hinder the weaponization of cyberspace or
absorption of stolen intellectual property. More important, both the United
States and China have strong incentives to moderate the intensity of their cyber
exploitation to preserve profitable interconnections and avoid costly punishment.
The policy backlash against U.S. firms and liberal internet governance by China
and others is ultimately more worrisome for U.S. competitiveness than
espionage; ironically, it is also counterproductive for Chinese growth.
No US-China cyber escalation – no incentives and interdependence
Lindsay, 15
Jon Lindsay, Assistant Professor of Digital Media and Global Affairs at the University of Toronto,
research scientist with the University of California Institute on Global Conflict and Cooperation,
assistant adjunct professor at the UC San Diego School of International Relations and Pacific
Studies, and an Oxford Martin Associate with the Oxford Global Cyber Security Capacity Centre,
“Exaggerating the Chinese Cyber Threat,” Belfer Center for Science and International Affairs,
Harvard Kennedy School, May 2015,
http://belfercenter.ksg.harvard.edu/publication/25321/exaggerating_the_chinese_cyber_threat.
html // IS
Many Western observers fear that cyber reform based on the principle of internet sovereignty
might legitimize authoritarian control and undermine the cosmopolitan promise of the
multistakeholder system. China, however, benefits too much from the current system
to pose a credible alternative. Tussles around internet governance are more likely
to result in minor change at the margins of the existing system, not a major
reorganization that shifts technical protocols and operational regulation to the United Nations.
Yet this is not a foregone conclusion, as China moves to exclude U.S. firms such as IBM, Oracle,
EMC, and Microsoft from its domestic markets and attempts to persuade other states to support
governance reforms at odds with U.S. values and interests.
CONCLUSION
Information technology has generated tremendous wealth and innovation for
millions, underwriting the United States' preponderance as well as China's
meteoric rise. The costs of cyber espionage and harassment pale beside the
mutual benefits of an interdependent, globalized economy. The inevitable
frictions of cyberspace are not a harbinger of catastrophe to come, but rather a
sign that the states inflicting them lack incentives to cause any real harm.
Exaggerated fears of cyberwarfare or an erosion of the United States' competitive
advantage must not be allowed to undermine the institutions and architectures
that make the digital commons so productive.
A2: Border Overload
No overload – there are still gaps
AP, 14
Associated Press, “Drones patrol half of Mexico border,” The Daily Mail, 11/13/14,
http://www.dailymail.co.uk/wires/ap/article-2832607/Drones-patrol-half-Mexico-border.html
// IS
The government has operated about 10,000 drone flights under the strategy,
known internally as "change detection," since it began in March 2013. The flights
currently cover about 900 miles, much of it in Texas, and are expected to expand to the
Canadian border by the end of 2015.
The purpose is to assign agents where illegal activity is highest, said R. Gil
Kerlikowske, commissioner of Customs and Border Protection, the Border Patrol's parent agency,
which operates nine unmanned aircraft across the country.
"You have finite resources," he said in an interview. "If you can look at some very
rugged terrain (and) you can see there's not traffic, whether it's tire tracks or
clothing being abandoned or anything else, you want to deploy your resources to
where you have a greater risk, a greater threat."
If the video shows the terrain unchanged, Border Patrol Chief Michael Fisher calls it "proving the
negative" — showing there isn't anything illegal happening there and therefore no need for agents
and fences.
The strategy was launched without fanfare and expanded at a time when President Barack Obama
prepares to issue an executive order by the end of this year to reduce deportations and enhance
border security.
Rep. Michael McCaul, a Texas Republican who chairs the House Homeland
Security Committee, applauded the approach while saying that surveillance
gaps still remain. "We can no longer focus only on static defenses such as
fences and fixed (camera) towers," he said.
A2: NSA good – cyberterror
Alt causes – the aff isn’t enough
Goldsmith, 13
Jack Goldsmith, Henry L. Shattuck Professor at Harvard Law School, “We Need an Invasive
NSA,” New Republic, 10/10/13, http://www.newrepublic.com/article/115002/invasive-nsa-will-protect-us-cyber-attacks // IS
To keep these computers and networks secure, the government needs powerful
intelligence capabilities abroad so that it can learn about planned cyberintrusions. It also needs to raise defenses at home. An important first step is to
correct the market failures that plague cybersecurity. Through law or regulation,
the government must improve incentives for individuals to use security software,
for private firms to harden their defenses and share information with one another, and for
Internet service providers to crack down on the botnets—networks of compromised zombie
computers—that underlie many cyber-attacks. More, too, must be done to prevent insider
threats like Edward Snowden’s, and to control the stealth introduction of
vulnerabilities during the manufacture of computer components—vulnerabilities
that can later be used as windows for cyber-attacks. And yet that’s still not enough. The
U.S. government can fully monitor air, space, and sea for potential attacks from
abroad. But it has limited access to the channels of cyber-attack and cyber-theft,
because they are owned by private telecommunication firms, and because
Congress strictly limits government access to private communications. “I can’t
defend the country until I’m into all the networks,” General Alexander reportedly told
senior government officials a few months ago. For Alexander, being in the network means having
government computers scan the content and metadata of Internet communications in the United
States and store some of these communications for extended periods. Such access, he thinks,
will give the government a fighting chance to find the needle of known malware
in the haystack of communications so that it can block or degrade the attack or
exploitation. It will also allow it to discern patterns of malicious activity in the
swarm of communications, even when it doesn’t possess the malware’s signature.
And it will better enable the government to trace back an attack’s trajectory so
that it can discover the identity and geographical origin of the threat.
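Goldsmith's "needle of known malware in the haystack of communications" is, mechanically, signature matching over traffic. A minimal sketch of the idea follows; the signatures, names, and packets are invented for illustration, and real scanners match byte patterns and hashes at line rate rather than scanning Python strings:

```python
# Hypothetical signatures; real systems match on hashes/byte patterns at scale.
SIGNATURES = {b"\x4d\x5a\x90\x00evil": "dropper-A", b"c2.example.net": "beacon-B"}

def scan(packet: bytes):
    """Return the names of any known-malware signatures found in a packet -
    the 'needle of known malware' search over a stream of communications."""
    return [name for sig, name in SIGNATURES.items() if sig in packet]

packets = [b"GET / HTTP/1.1\r\nHost: c2.example.net\r\n", b"benign payload"]
hits = [h for p in packets for h in scan(p)]
print(hits)  # ['beacon-B']
```

The point of the sketch is the asymmetry Goldsmith describes: matching a known signature is cheap, but it only works if the scanner is "in the network" where the haystack flows.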
SIGINT CP
1nc
The National Security Agency should make SIGINT data available to
relevant industries upon discovery
Shifting from a production focus to a discovery focus in SIGINT is key
to solve overload – data reduction isn’t enough
SIDtoday, 11
The Signals Intelligence Directorate Today Editor, “Is There a Sustainable Ops Tempo in S2? How
Can Analysts Deal With the Flood of Collection? – An interview with [redacted] (conclusion),”
4/16/11, https://s3.amazonaws.com/s3.documentcloud.org/documents/2089125/analytic-modernization.pdf // IS
6. Q: (U) When we last interviewed you in August 2009 — you were Assistant DDAP at the time —
you spoke of the importance of creating a sustainable ops tempo in S2.
Have you succeeded?
A: (U). No. No, I haven't... And it's not just because there's so much happening in the world.
Here's the thing: we have to take a fundamentally different approach to how we do
business. The volume and quality of access we have now gives us an
unprecedented capability to produce some of the best intelligence we ever have,
but the fact is that we are only getting to a fractional portion of what we have
access to. I can't tell you for sure that as good as what we produce is, that it's the absolute best
we can do. It's not sustainable for the workforce or the mission to keep working in
the way we always have.
(U//FOUO) We've embarked on Analytic Modernization. In the past there have
been transformations... modernizations... renovations... it gets tiring! But to a
large degree those efforts have represented supercharging existing capabilities - doing more with less. We need to fundamentally change how we interact with
SIGINT.
(U//FOUO) This is a big deal, because for decades we've trained our workforce - and took pride in
the fact - that we had the unique responsibility within the IC to manage this information in
accordance with the 4th Amendment. To that extent we thought of SIGINT as "radioactive" and it
was our job to render the information safe for others. Our process was about managing the
information in a manner consistent with the privacy rights of U.S. persons.
(U//FOUO) I believe we have developed an awareness of the information and an understanding
of our authorities that allows us to think differently about that relationship... and in so doing, to
create an increased capacity and recover more time for the analysts. I don't mean only analysts in
SID, but also analysts in LAD, in NTQC. etc. The key is to better leverage the Intelligence
Community and our partner relationships for the exploitation of SIGINT, which
is something in the past we would not have done to the degree we're proposing - it
was considered our domain (and "radioactive").
(U//FOUO) We must take advantage of the expertise and capabilities in the IC and
our customer base to enhance discovery and capacity, and to make the actionable
information available almost as soon as we encounter it. This is not about turning
NSA into a collection resource for others (although collection is in fact one of our great
strengths and one we are uniquely qualified to undertake) - it's about making sure we don't
expend cryptologic resources doing work that is not uniquely cryptologic in
nature - work that others can do.
(U) Q: Can you give an example?
A: (S//SI//REL) If NSA discovers a pathway into an adversary's information space and we extract
a terabyte of CAD [computer-aided design] drawings of weapon designs, is it of best value for the
IC to have talented NSA analysts work their way through this 1 terabyte of data? Is it uniquely
cryptologic? Or is it of better benefit to the nation that we expose it immediately to the best and
brightest weapon designers in the US government to work on in a collaborative space, to triage,
assess and exploit the value of that information?
(U//FOUO) In so
doing we are not precluding NSA analysts from continued access
to the data, any more than a published SIGINT product report isn't available for
future reference. We are just leveraging the power and expertise of others so that
we can turn our attention to that which only we can exploit by virtue of our
unique talents.
2nc
Companies respond positively and NSA analysts can go deeper into
actual problems
SIDtoday, 11
The Signals Intelligence Directorate Today Editor, “Is There a Sustainable Ops Tempo in S2? How
Can Analysts Deal With the Flood of Collection? – An interview with [redacted] (conclusion),”
4/16/11, https://s3.amazonaws.com/s3.documentcloud.org/documents/2089125/analytic-modernization.pdf // IS
A: (U//FOUO) If we get access to the best information on a topic, the value will
speak for itself and the customers will make the resources available. It's
like YouTube videos - they "go viral" when people with good reputations recommend a video to
others. If we put the information out there and monitor the customer response to
it, we'll know when to recommend specific items they might want to take a close
look at. Amazon does the same thing by looking at how you react to the products you've looked
at. We can figure out what the communities of interest are for that topic... If the material is in a
foreign language, we have tools they can use to get the gist of it, and if it looks promising, they can
use the National Virtual Translation Center to get it translated. We want our language analysts
focused on uniquely cryptologic problems that rarely boil down to a straightforward translation.
(U//FOUO) So what does this approach accomplish? We've exposed the
intelligence to people who can interpret it and use it, and we've created
opportunities for collaboration. We've also off-loaded the responsibility to
manage the data in our repositories and own the compliance responsibility. If
data is stored in the Library of National Intelligence, someone else is paying for it.
(U//FOUO) In the end, we exist to produce information. The only way to go more
deeply into targets is to avoid getting stuck on production that others can do for
themselves. Our challenge is to be always out looking for something new. We
need to think about problems, not just about production.
(U//FOUO) The collaborative component means it's not an NSA view, a DIA view, etc... Rather,
it's opened up to all on A-Space. I would like to see people log onto A-Space and announce "I want
to share traffic and create a multi-seal report based on joint input." Why not collaborate at
the point of discovery?
Overload is killing effectiveness now – a production-to-discovery shift is key
NSA, 11
Leaked NSA document, “SIGINT Mission Thread Three,” 8/1/11,
https://s3.amazonaws.com/s3.documentcloud.org/documents/2088968/gladwell-amp-nsa.pdf
// IS
Achieving a Balance Between Discovery and Production
(U//FOUO) The key to good decision
making is not knowledge. It is understanding.
We are swimming in the former. We are desperately lacking in the latter."* In the
afterword to his 2005 #1 national bestseller Blink — The Power of Thinking Without Thinking,
author Malcolm Gladwell provides his perspective on the danger of confusing information
(collection) with understanding (analysis). Gladwell has captured one of the biggest challenges
facing SID today. Our costs associated with this information overload are not only
financial, such as the need to build data warehouses large enough to store the
mountain of data that arrives at our doorstep each day, but also include the more
intangible costs of too much data to review, process, translate, and report. SID's
first strategic goal for 2011-2015, the challenge to revolutionize analysis, is aimed squarely at this
tension between information and understanding.
(U//FOUO) In order to revolutionize intelligence, we must "fundamentally shift
our analytic approach from a production to a discovery bias, radically increasing
operational impact across all mission domains."** With so much data at our
fingertips, we must learn how to push the lesser value data to the side, move data
that needs less analysis directly to our customers, and provide ourselves the
needed agility to dig deep into the toughest analytic problems to produce
understanding from well-hidden information.
terror nb
The NSA is overloaded – terror surprises are coming absent reform –
the plan’s reduction isn’t sufficient
SIDtoday, 11
The Signals Intelligence Directorate Today Editor, “Is There a Sustainable Ops Tempo in S2? How
Can Analysts Deal With the Flood of Collection? – An interview with [redacted] (conclusion),”
4/16/11, https://s3.amazonaws.com/s3.documentcloud.org/documents/2089125/analytic-modernization.pdf // IS
A: (S//SI//REL) We live
in an Information Age when we have massive reserves of
information and don't have the capability to exploit it. I was told that there are 2
petabytes of data in the SIGINT System at any given time. How much is that?
That's equal to 20 million 4-drawer filing cabinets. How many cabinets per
analyst is that?? By the end of this year, we'll have 1 terabyte of data per
second coming in. You can't crank that through the existing processes and be
effective.
Q: (U) ...So it's a matter of volume?
A: (S//SI//REL) Not volume alone, but also complexity. We need to piece
together the data. It's impossible to do that using traditional methods. Strong
selectors - like phone numbers - will become a thing of the past. It used to be that
if you had a target's number, you could follow it for most of your career. Not
anymore. My daughter doesn't even make phone calls, and many targets do the same. Also, the
commercial market demands privacy, and this will drive our targets to go
encrypted, maybe into unexploitable realms. Our nation needs us to look for patterns
surrounding a particular spot on Earth and make the connections - who can do that if not us?
And we can't do it using traditional methods.
Q: (U) Looking into the future, is there anything that especially worries you? ...An eventuality
(internal or external) that would make it hard for A&P to continue to put out quality intelligence?
A: (U//FOUO) I'm worried that we have so
much good stuff that we could lock
down analysts and have them just producing product, and something would
jump out and surprise us. So we need the discipline to invest in the wild and the
unknowns.
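The filing-cabinet comparison in the first answer above can be sanity-checked with rough arithmetic; the bytes-per-page and pages-per-cabinet figures below are assumptions for illustration, not numbers from the interview:

```python
PB = 10**15                       # one petabyte, decimal convention
bytes_per_page = 5 * 1024         # ~5 KB of plain text per printed page (assumption)
pages_per_cabinet = 4 * 5000      # 4 drawers x ~5,000 pages per drawer (assumption)

cabinet_bytes = bytes_per_page * pages_per_cabinet  # ~100 MB per cabinet
cabinets = 2 * PB / cabinet_bytes                   # cabinets needed for 2 PB
print(f"{cabinets/1e6:.0f} million filing cabinets")
```

Under these assumptions the 2 petabytes works out to roughly 20 million cabinets, consistent with the figure quoted in the card.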
SIGINT is uniquely key to the war on terror
Shea, 11
Teresa Shea, director of the Signals Intelligence Directorate, “(U) SIGINT Year in Review,
November 2011,” 11/22/11, https://firstlook.org/theintercept/document/2015/05/18/sidtoday-2011-review/ // IS
(S//SI//REL) This has been a milestone year in
the war on terrorism. Certainly the
most powerful and enduring accomplishment was the successful strike against
Osama Bin Laden. For nearly a decade a dedicated group of SIGINT professionals
would not let go of the search, and their persistence paid off in substantive
contributions at critical points on the road to Abbottabad. In the end many of you brought your
expertise to bear in the final weeks and hours, resulting in a tremendous outcome in our
counterterrorism efforts. Even then you didn’t rest on your laurels: you played a
significant supporting role a few months later in the operations against Atiyah
abd-Rahman in Pakistan and Anwar al-Awlaqi in Yemen. Key targets
continue to be removed from the battlefield as a result of your outstanding
SIGINT contributions.
ptx nb
Only the CP avoids the link to politics – the NSA can reform itself
NSA, 9
NSA, “About NSA: Mission,” 1/15/09 https://www.nsa.gov/about/mission/index.shtml // IS
Executive Order 12333, originally issued 4 December 1981, delineates the
NSA/CSS roles and responsibilities. In part, the Director, NSA/Chief, CSS is
charged to:
Collect (including through clandestine means), process, analyze, produce, and disseminate signals
intelligence information and data for foreign intelligence and counterintelligence purposes to
support national and departmental missions;
Act as the National Manager for National Security Systems as established in law and
policy, and in this capacity be responsible to the Secretary of Defense and to the Director,
National Intelligence;
Prescribe security regulations covering operating practices, including the
transmission, handling, and distribution of signals intelligence and
communications security material within and among the elements under control
of the Director of the National Security Agency, and exercise the necessary
supervisory control to ensure compliance with the regulations.
NSA reform doesn’t require debate – empirics
Savage et al., 15
Charlie Savage, Julia Angwin, Jeff Larson, and Henrik Moltke, *Washington correspondent for The
New York Times, “Hunting for Hackers, N.S.A. Secretly Expands Internet Spying at U.S. Border,”
The New York Times, 6/4/15, http://www.nytimes.com/2015/06/05/us/hunting-for-hackers-nsa-secretly-expands-internet-spying-at-us-border.html // IS
WASHINGTON — Without public notice
or debate, the Obama administration has
expanded the National Security Agency‘s warrantless surveillance of Americans’
international Internet traffic to search for evidence of malicious computer
hacking, according to classified N.S.A. documents. In mid-2012, Justice Department
lawyers wrote two secret memos permitting the spy agency to begin hunting on
Internet cables, without a warrant and on American soil, for data linked to
computer intrusions originating abroad — including traffic that flows to
suspicious Internet addresses or contains malware, the documents show.
Yahoo CP
1nc
Text: The National Security Agency should partially throttle its data
collected from Yahoo.
That eases information overload that cripples counterterrorism
Stanganelli, 13
(Joe, Founder & Principal, Beacon Hill Law, JD in Law from Suffolk University Law School, Crew
Leader Asst., Vacant-Delete Check of US Census Bureau, 11-13-13, All Analytics, “Thinking on
NSA Excess & Enterprise Implications”,
http://www.allanalytics.com/author.asp?section_id=1437&doc_id=269514, amp)
Documents received by The Washington Post describe Muscular, an NSA effort
that infiltrated Google and Yahoo networking traffic. Muscular gave NSA analysts
access to millions of emails, attachments, and other web communications each day – including entire Yahoo mailboxes.
The NSA needed to develop new filtering and distribution systems to process this
data mother lode, as indicated in the documents. Even with these systems, the new
data (particularly from Yahoo) proved too much to handle. Yahoo email began to
account for approximately 25 percent of daily data being processed by the NSA's main
analytics platform for intercepted Internet traffic. Most of the data was more than six
months old and virtually useless. Analysts became so frustrated that they
requested "partial throttling" of Yahoo data.
"Numerous target offices have complained about this collection 'diluting' their
workflow," according to one NSA document. "The sheer volume" of data is
unjustified by its "relatively small intelligence value."
Other NSA data mining programs have overwhelmed the agency, as reported elsewhere. When
spammers hacked a target Yahoo account last year, the account's address book blew up with
irrelevant email addresses. Consequently, the NSA had to limit its address book data collection
efforts to only Facebook contacts.
These broad data sweeps have been significantly less successful than the NSA's
more targeted operations. In an interview with the Daily Caller, former NSA official
William Binney said the NSA's inefficient big data processes crippled its ability to
react to a tipoff about Tamerlan Tsarnaev – information that could have curtailed
the Boston Marathon bombing.
They're making themselves dysfunctional by collecting all of this data. They've got so
much collection capability but they can't do everything… The basic problem is they
can't figure out what they have, so they store it all in the hope that down the road
they might figure something out and they can go back and figure out what's
happening and what people did.
Still, the White House and other government departments and agencies place the NSA under
what The New York Times calls an intense "pressure to get everything" -- a pressure that has
spawned a data obsession.
The problem with this obsession is twofold. The first issue is the ROI of gathering haystacks –
resources better spent elsewhere are diverted to finding, gathering, filtering, and
ultimately throttling and fixing oversized and under-relevant data.
Throttling Yahoo data alleviates overload
Gallagher, 13
(Sean, Masters in Communication Design, Engineering Task Manager CBIS Federal, Ars
Technica, “How the NSA’s MUSCULAR tapped Google’s and Yahoo’s private networks”,
http://arstechnica.com/information-technology/2013/10/how-the-nsas-muscular-tapped-googles-and-yahoos-private-networks/, amp)
Forget the PRISM—go for the clear
The NSA already has access to selected content on Google and Yahoo through its
PRISM program, a collaborative effort with the FBI that compels cloud providers to turn over
select information through a FISA warrant. And it collects huge quantities of raw Internet traffic
at major network exchange points, allowing the agency to perform keyword searches in real time
against the content and metadata of individual Internet packets.
But much of that raw traffic is encrypted, and the PRISM requests are relatively limited in scope.
So the NSA went looking for a way to get the same sort of access to encrypted traffic to cloud
providers that it had with unencrypted raw Internet traffic. The solution that the NSA and the
GCHQ devised was to tap into the networks of the providers themselves as they crossed
international borders.
Google and Yahoo maintain a number of overseas data centers to serve their international
customers, and Internet traffic to Google and Yahoo is typically routed to the closest data center
to the user. The Web and other Internet servers that handle those requests generally
communicate with users via a Secure Socket Layer (SSL) encrypted session and act as a gateway
to other services running within the data center—in the case of Google, this includes services like
Gmail message stores, search engines, Maps requests, and Google Drive documents. Within
Google's internal network, these requests are passed unencrypted, and requests often travel
across multiple Google data centers to generate results.
In addition to passing user traffic, the fiber connections between data centers are also
used to replicate data between data centers for backup and universal access. Yahoo, for
example, replicates users' mailbox archives between data centers to ensure that they're available
in case of an outage. In July of 2012, according to documents Snowden provided to the
Washington Post, Yahoo began transferring entire e-mail accounts between data centers in its
NArchive format, possibly as part of a consolidation of operations.
By gaining access to networks within Google's and Yahoo's security perimeters, the NSA was
able to effectively defeat the SSL encryption used to protect customers' Web
connections to the cloud providers, giving the agency's network filtering and data
mining tools unfettered access to the content passing over the network. As a
result, the NSA had access to millions of messages and Web transactions per day
without having to use its FISA warrant power to compel Google or Yahoo to
provide the data through PRISM. And it gained access to complete mailboxes of e-mail
at Yahoo—including attachments that would not necessarily show up as part of intercepted
Webmail sessions, because users would download them separately.
But the NSA and the GCHQ had to devise ways to process the streams of data
passing between data centers to make it useful. That meant reverse-engineering
some of the software and network interfaces of the cloud providers so that they
could break apart data streams optimized to be sent across wide-area networks over
multiple simultaneous data links. It also meant creating filtering capabilities that
allowed the NSA and the GCHQ to separate traffic of intelligence interest from
the vast amount of intra-data center communications that have nothing to do
with user activity. So the NSA and the GCHQ configured a "distributed data
distribution system" (as the NSA described MUSCULAR in this FAQ about the
BOUNDLESSINFORMANT metadata search tool acquired by the American Civil
Liberties Union) similar to XKeyscore to collect, filter, and process the content on those
networks.
Mailbox overload
Even with filtering, the volume of that data presented a problem to NSA analysts.
When Yahoo started performing its mailbox transfers, that data rapidly started to
eclipse other sources of data being ingested into PINWALE, the NSA's primary
analytical database for processing intercepted Internet traffic. PINWALE also pulls
in data harvested by the XKeyscore system and processes about 60 gigabytes of data per day that
it gets passed from collection systems.
By February of 2013, Yahoo mailboxes were accounting for about a quarter of that daily traffic.
And because of the nature of the mailboxes—many of them contained e-mail messages that were
months or years old—most of the data was useless to analysts trying to find current
data. Fifty-nine percent of the mail in the archives was over 180 days old, making it almost
useless to analysts.
So the analysts requested "partial throttling" of Yahoo content to
prevent data overload. "Numerous analysts have complained of [the Yahoo
data's] existence," the notes from the PowerPoint slide on MUSCULAR stated, "and the
relatively small intelligence value it contains does not justify the sheer volume of
collection at MUSCULAR (1/4th of the daily total collect)."
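The "partial throttling" the analysts requested amounts to an age filter applied before ingest. A toy sketch of that triage follows; the `throttle_by_age` helper, the timestamps, and the payloads are invented, while the 180-day cutoff is the staleness figure the card cites:

```python
from datetime import datetime, timedelta, timezone

MAX_AGE = timedelta(days=180)  # the staleness cutoff cited in the MUSCULAR notes

def throttle_by_age(messages, now=None):
    """Partial throttle: pass through only messages fresh enough to matter.

    `messages` is an iterable of (timestamp, payload) pairs; stale archive
    traffic is counted and dropped instead of being ingested downstream."""
    now = now or datetime.now(timezone.utc)
    kept, dropped = [], 0
    for ts, payload in messages:
        if now - ts <= MAX_AGE:
            kept.append((ts, payload))
        else:
            dropped += 1
    return kept, dropped

now = datetime(2013, 2, 1, tzinfo=timezone.utc)
feed = [
    (datetime(2013, 1, 20, tzinfo=timezone.utc), "recent msg"),
    (datetime(2012, 3, 5, tzinfo=timezone.utc), "archive msg"),   # ~11 months old
    (datetime(2011, 8, 14, tzinfo=timezone.utc), "archive msg"),  # over a year old
]
kept, dropped = throttle_by_age(feed, now=now)
print(f"kept {len(kept)}, dropped {dropped}")  # kept 1, dropped 2
```

With 59 percent of the Yahoo archive over 180 days old, a filter like this discards the majority of the stream before it ever dilutes an analyst's workflow.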
Snowball CP
1nc
Text: United States federal intelligence agencies should cease
snowball sampling wiretapping activities.
That solves overload and poor network identification and is replaced
by other methods
Carley and Tsvetovat, 6
(Kathleen--professor in the School of Computer Science in the department - Institute for Software
Research - at Carnegie Mellon University, Maksim--Ph.D. from Carnegie Mellon University's
School of Computer Science, with concentration on computational modeling of organizations, 30
August 2006, Springer Science + Business Media, LLC, “On effectiveness of wiretap programs in
mapping social networks”, Proquest, amp)
Snowball sampling methods are known to be a biased toward highly connected actors
and consequently produce core-periphery networks when these may not necessarily
be present. This leads to a biased perception of the underlying network which can
have negative policy consequences, as in the identification of terrorist networks.
When snowball sampling is used, the potential overload of the information collection
system is a distinct problem due to the exponential growth of the number of suspects
to be monitored. In this paper, we focus on evaluating the effectiveness of a wiretapping
program in terms of its ability to map the rapidly evolving networks within a covert organization.
By running a series of simulation-based experiments, we are able to evaluate a broad
spectrum of information gathering regimes based on a consistent set of criteria. We
conclude by proposing a set of information gathering programs that achieve higher
effectiveness than snowball sampling, and at a lower cost.
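The degree bias Carley and Tsvetovat describe is easy to reproduce in simulation: snowball waves reach nodes through edges, so highly connected actors are oversampled. A minimal sketch on a synthetic graph follows; every parameter, graph, and function name here is illustrative, not drawn from the paper:

```python
import random
from statistics import mean

def make_graph(n=1000, avg_degree=4, hubs=5, hub_degree=120, seed=7):
    # Sparse random graph plus a few highly connected "hub" actors.
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    def link(a, b):
        if a != b:
            adj[a].add(b); adj[b].add(a)
    for _ in range(n * avg_degree // 2):
        link(rng.randrange(n), rng.randrange(n))
    for h in range(hubs):
        for _ in range(hub_degree):
            link(h, rng.randrange(n))
    return adj

def snowball_wave(adj, seed_node):
    # One wiretap "wave": monitoring a suspect surfaces every contact.
    return adj[seed_node]

adj = make_graph()
rng = random.Random(1)
pop_mean = mean(len(adj[v]) for v in adj)
seeds = [rng.randrange(len(adj)) for _ in range(300)]
sampled = [v for s in seeds for v in snowball_wave(adj, s)]
sample_mean = mean(len(adj[v]) for v in sampled)
print(f"population mean degree: {pop_mean:.1f}")
print(f"snowball-sampled mean degree: {sample_mean:.1f}")  # noticeably higher
```

Because contacts are reached via edges, the sampled nodes' mean degree exceeds the population's, which is the mechanism behind both the core-periphery bias and the exponential growth in suspects to monitor.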
terror nb
Snowball makes overload uniquely dangerous – creates
vulnerabilities
Beutel, 7
Alejandro Beutel, Researcher for Countering Violent Extremism at the National Consortium for
the Study of Terrorism and Responses to Terrorism (START), “Breach of Law, Breach of Security –
NSA Wiretapping,” 1/31/07, http://blog.minaret.org/?p=197 // IS
Second, according to the 2006 USA Today article, NSA officials claimed domestic SIGINT
operations help fight terrorism by using the data produced for “social network analysis.” However
the current social network analysis methods used to guide SIGINT operations called
“ snowball sampling,” (a type of electronic dragnet) are not well suited for the
type of counter-terrorism operations traditionally done by FBI criminal
investigators. Research conducted by two social network experts, Maksim
Tsvetovat and Kathleen Carley [PDF], finds that the snowball method is better
suited for highly connected groups, as opposed to small, loosely connected
cellular networks [PDF] which define Al-Qaeda. The NSA’s snowball sampling
methods gathered a massive volume of useless information that led FBI officials
nowhere, wasting limited resources and time. Furthermore, the domestic SIGINT
operations put an enormous technical strain on the NSA’s resources, forcing the agency to
consume voracious amounts of electricity–on top of dealing with its current computer problems–
to sustain its current operational capacity. This jeopardizes our national security
by running the risk of another electrical overload, similar to the one that
paralyzed the agency seven years ago and left our nation vulnerable for nearly
three days.
Splunk CP
1nc
Text: The National Security Agency should implement Splunk to
analyze its data.
Solves overload
Trobock, 14
(Randall, Master of Science in Cybersecurity from Utica College, May 2014, “THE APPLICATION
OF SPLUNK IN THE INTELLIGENCE GATHERING PROCESS”, Proquest, amp)
Splunk is a software application that can ingest massive amounts of data from
various sources and conduct analytics on this data, in order to discover relevant
correlations that would have taken days, weeks or months to discover manually.
Splunk has the capability to ingest these data sources into its analysis engine, and
create actionable intelligence based on pre-determined correlations. The idea
behind correlation is to monitor and create alerts for when intelligence sources
intersect. This capability greatly reduces the amount of time required for analyzing
data, thereby minimizing the information overload problem and creating greater
opportunity for taking action when necessary.
Faulty intelligence methods, such as those that would be the result of information overload, pose
a significant threat to peace throughout the world. For example, having inaccurate or incomplete
intelligence on Iran’s nuclear capabilities and the locations or nature of its nuclear plants, a risk-averse Israel might overestimate its need to take both drastic and pre-emptive measures against
Iran (Young, 2013). This could result in involvement from several countries, including the United
States, potentially costing billions of dollars and thousands of soldiers’ lives.
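The correlation capability Trobock describes (alerting when independent intelligence sources intersect on the same subject) can be illustrated with a minimal sketch. This is illustrative Python, not Splunk's SPL, and the feed names and `correlate` helper are hypothetical:

```python
from collections import defaultdict

def correlate(events, min_sources=2):
    """Flag any subject that at least `min_sources` distinct feeds
    intersect on.  A real Splunk deployment would express this as a
    correlation search in SPL over indexed events; this toy version
    only shows the intersection-and-alert logic."""
    seen = defaultdict(set)            # subject -> feeds it appeared in
    for feed, subject in events:
        seen[subject].add(feed)
    return sorted(s for s, feeds in seen.items() if len(feeds) >= min_sources)

# Hypothetical feeds echoing the Boston example later in this file:
# travel records and social media posts intersect on one subject.
events = [("travel", "suspect_a"),
          ("social_media", "suspect_a"),
          ("travel", "suspect_b")]
alerts = correlate(events)             # only suspect_a crosses two feeds
```

The point of the predetermined correlation is that the analyst sees only `alerts`, not the full event stream, which is what "minimizing the information overload problem" means in the card.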
Solves human bias
Trobock, 14
(Randall, Master of Science in Cybersecurity from Utica College, May 2014, “THE APPLICATION
OF SPLUNK IN THE INTELLIGENCE GATHERING PROCESS”, Proquest, amp)
In addition to information overload, there are human mental processes such as perception
and human mindsets and biases that can also hinder the intelligence analysis process. The fact
that the intelligence lifecycle is largely run by human beings makes it prone to a large
amount of subjectivity. Of all the intelligence analysis obstacles that might be presented, those
that lie within human mental processes can be the most complex. Perception, human mindsets,
and cognitive biases are all factors that can affect an intelligence report. With these factors in
play, a person’s own assumptions, biases, and mindsets can affect an intelligence report,
potentially causing action to be taken based on their own perceptions.
Perception plays a significant role in intelligence analysis. What a person perceives will
often be strongly influenced by several external factors, such as past experience, cultural
values, education, and role requirements (Heuer & Center for the Study of Intelligence
(U.S.), 2001, p. 7). An example of human perception having a negative effect on intelligence
analysis would be if an analyst were to make a conclusion that a potentially
catastrophic event was going to take place based on their own experience or knowledge
on a given subject. However, the reality could be that the subject within the
intelligence report has completely different intentions. Given the fact that intelligence
consumers must often take action based on the information received in an intelligence analyst’s
report, it is important that an analyst’s perception is not constructing its own
reality when producing an intelligence report for a consumer.
People will often employ various simplifying strategies and rules of thumb to ease the
processing of complex information in the decision making process. These strategies and
rules of thumb can lead to faulty judgments known as cognitive biases. Cognitive
biases can affect the intelligence analysis process within the areas of evaluation of
evidence, perception of cause and effect, estimation of probabilities, as well as
retrospective evaluation of intelligence reports (Heuer & Center for the Study of Intelligence
(U.S.), 2001). One example of a cognitive bias is the focusing effect. This bias refers to the
tendency to place too much importance on one aspect of an event. In an intelligence collection
and analysis scenario, too much focus placed on a single event can allow a more important event
or aspect to go unnoticed. Removing subjective aspects from the intelligence analysis process such
as cognitive biases will go a long way in creating actionable reports for the intelligence consumer.
Because Splunk conducts its analytics using predetermined correlations, it has
the ability to minimize the human factor within intelligence analysis. When
intelligence data meets the criteria defined within the predetermined correlations in Splunk, an
alert can be configured to notify the analyst, at which time they would be able to draw an
informed conclusion, without introducing misperceptions or human bias into the
report. These types of conclusions are what provide relevant and actionable intelligence into an
intelligence report.
Examples prove—makes counterterrorism more effective
Trobock, 14
(Randall, Master of Science in Cybersecurity from Utica College, May 2014, “THE APPLICATION
OF SPLUNK IN THE INTELLIGENCE GATHERING PROCESS”, Proquest, amp)
The argument can clearly be made that the problem of information overload that exists in
law enforcement and intelligence analysis can be solved by implementing and using
Splunk to its full potential. In the days and weeks after the Boston bombing attack,
intelligence surfaced through several media sources that would have been invaluable
in possibly preventing this attack from occurring. The intelligence, in the form of social
media posts, suspicious travel, and even an open murder investigation of a known
associate, was wide ranging and would have caused an information overload
situation had the FBI been actively investigating Tsarnaev. Unfortunately, the FBI
had concluded their investigation on Tsarnaev shortly after receiving the initial tip
from the Russian government, and no further investigation was conducted. Had
the FBI continued their investigation and been able to implement Splunk to its full potential,
leveraging it as a resource for proactively following and analyzing Tsarnaev’s actions in the
time preceding the attack, the attack itself could have potentially been averted.
An example of information overload was also presented in the case study of the
Chandler Police Department. They were able to gain efficiencies not only in the
health monitoring of their internal infrastructure, but also on the ways they track crime and
reported incidents. This included gathering data from various sources within the department,
which would not have been practical to do without a solution such as Splunk. Based on
the success that they have seen in their implementation, an argument can be made that the
Chandler Police Department would be remiss not to further explore additional uses for Splunk. In
addition to the efficiencies gained in the case study, which focused mainly on gathering metrics
around police processes, the Chandler Police Department could use Splunk to actually prevent
and solve crimes. Police departments need to be able to process the same types of
intelligence that agencies such as the FBI and the Central Intelligence Agency (CIA)
need to process during a typical investigation. Therefore, it would make sense for the
Chandler Police Department to extend their Splunk implementation.
UQ CP
1nc
Text: The National Security Agency should allocate more
funding to data analysis through means including but not limited to
computing power, computing speed, and human resources.
The CP solves—the NSA has a huge budget and tech barriers are
surpassed
Soltani, 13
(Ashkan, independent researcher who previously investigated online privacy issues as a staff
technologist with the US Federal Trade Commission, 8-16-13, MIT Technology Review, “Soaring
Surveillance”, http://www.technologyreview.com/view/516691/soaring-surveillance/, amp)
Each of the NSA programs recently disclosed by the media is unique in the type of data it
accesses, but all of them have been made possible by the same trend: surveillance
techniques have been exploding in capacity and plummeting in cost. One leaked
document shows that between 2002 and 2006, it cost the NSA only about $140 million
to gather phone records for 300 million Americans, operate a system that collects
e-mails and other content from Internet companies such as Google, and develop new
“offensive” surveillance tools for use overseas. That’s a minuscule portion of the
NSA’s $10 billion annual budget.
Spying no longer requires following people or planting bugs; rather, it means filling out forms to
demand access to an existing trove of information. The NSA doesn’t bear the cost of collecting or
storing data and no longer has to interact with its targets. The reach of its new programs is vast,
especially when compared with the closest equivalent possible just 10 years ago.
What we have learned about the NSA’s capabilities suggests it has adopted a style of
programmatic, automated surveillance previously precluded by the limitations of
scale, cost, and computing speed. This is a trend with a firm lower bound. Once the cost of
surveillance reaches zero, we will be left with our outdated laws as the only protection. Whatever
policy actions are taken to regulate surveillance in light of the recent leaks should recognize that
technical barriers offer dwindling protection from unwarranted government surveillance at home
and abroad.
Avoids ptx
Congress easily granted the NSA hundreds of millions before—the CP
is secretive and obtains funding under the guise of routine,
uncontroversial construction
Deseret Morning News, 9
(12-20-2009, “Big Brother is coming: NSA's $1.9 billion cyber spy center a power grab”, lexis,
amp)
No, this power grab is for the stuff of Thomas Edison and Nikola Tesla – the juice needed to keep acres of NSA
supercomputers humming and a cyber eye peeled for the world's bad guys. Nearly a decade into the new millennium,
America's spy agency is power gridlocked at its sprawling Fort Meade, Md., headquarters. The NSA, which devours
electricity the same way teenage boys wolf down french fries at McDonald's, has been forced to look elsewhere to feed its
ravenous AC/DC appetite. "At the NSA, electrical power is political power. In its top-secret world, the coin of the realm is
the kilowatt," writes national security authority and author James Bamford. It's a simple equation: More data coming in
means more reports going out. More reports going out means more political clout for the agency, Bamford writes.
Intelligence historian and author Matthew M. Aid considers the NSA's quest for power a driving factor in the NSA's
selection of Camp Williams, which covers 28,000 acres bordering Utah and Salt Lake counties. During an Oct. 23 news
conference at the state Capitol officially announcing the new spy center, Glenn Gaffney, deputy director of national
intelligence for collection, said as much when he confirmed that one of the strengths of the Utah location was that it
"offered an abundant availability of low-cost power." There's been some speculation that the Camp Williams facility
dovetails with the NSA's controversial attempts to further establish itself as the lead dog for the government's expanding
cybersecurity initiatives, although NSA officials aren't tipping their hand. "I can't get into some of the specific details of the
kind of work that will go on at the center because it is a critical aspect of the way we are looking at doing cybersecurity
going forward," Gaffney said in his best NSA-ese. "I can say that the reason why we are doing the center is because of the
deep level of technical expertise that's needed to understand the nature of the threats." Given the NSA's penchant for
speaking little and revealing less, it sounds like he's saying, "Trust us." Zeros gone wild – The virtual mountains of data
needing such huge levels of power to mine can be brain-numbing. Think zeros gone wild. A 2008 report by the MITRE
Corp., prepared for the Department of Defense, conservatively estimates that the high volumes of data storage required for
NSA eavesdropping will reach the petabyte level by 2015. A petabyte is one quadrillion bytes (1,000,000,000,000,000).
There has been even wilder speculation that data storage may reach the yottabyte level within that same time frame. A
yottabyte, the largest unit of measure for computer data, equals 1,000,000,000,000,000,000,000,000 bytes. Either way,
the NSA is already drowning in information. The agency's former director, Lt. Gen. Michael V. Hayden, admitted the NSA
"is collecting far more data than it processes, and that the agency processes more data than it actually reports to its
consumers." In his book "The Shadow Factory," Bamford cites an already outdated study by the University of California at
Berkeley that measured global data trends. In 2002, there were 1.1 billion telephone lines in the world carrying nearly 3.8
billion minutes – approximately 15 exabytes of data. An exabyte is 1,000 petabytes. Cell phones added 2.3 exabytes of data,
while the Internet that year added another 32 petabytes to the bubbling information pot. Suffice it to say that seven years
hence, it's grown to a tsunami-size information wave that's being added to daily, which is where the NSA's robust sifting
technologies of today and the future come into play. "Once vacuumed up and stored in these near-infinite 'libraries,' the
data are then analyzed by powerful info weapons, supercomputers running complex algorithmic programs, to determine
who among us may be – or may one day become – a terrorist," Bamford writes. "In the NSA's world of automated
surveillance on steroids, every bit has a history, and every keystroke tells a story." Bigger Brother, if you will, once only
found in literature and the lexicon, has now taken up full-time residency in our daily lives. Power shortage – The Baltimore
Sun first reported in 2006 that the NSA was unable to install new supercomputers and other sophisticated equipment at
Fort Meade for fear of "blowing out the electrical infrastructure." The NSA and Fort Meade are Baltimore Gas & Electric's
largest customers, consuming roughly the amount of power that the city of Annapolis does, the Sun reported. In 2005, the
NSA took its first step to decentralize information gathering and storage by making known it would convert a former
470,000-square-foot Sony computer-chip building in San Antonio, Texas, into a satellite data facility. The Texas
Cryptologic Center, as it's being called, reportedly rivals the nearby Alamodome in size. Camp Williams was announced to
be the next such NSA facility, although back-channel chatter is now questioning if San Antonio is being shoved aside in
favor of increasing the Utah center's role. "I've heard the San Antonio deal is dead," said someone who closely follows the
NSA but asked not to be identified. "I was told that given current budgetary constraints that the NSA was basically told
they could have one, but not both centers, and it looks like they've chosen Utah." Should that be the case, no one
apparently has told the NSA, which would be ironic for an agency that prides itself on knowing everything. "Plans for the
NSA Texas Cryptologic Center are continuing," wrote NSA public affairs specialist Marci Green in a recent e-mail. "(The)
NSA has maintained a presence in San Antonio over the past two decades and plans to continue to have strong presence in
the area." Hiding in plain sight – A long-running joke has been that NSA actually stands for "No Such Agency." But lurking
in the shadows becomes trickier when you've grown into the 5,000-pound gorilla. Thus, as
the scope of the
NSA increases, the agency is continually perfecting its ability to hide in plain
sight by labeling much of what it does "classified" and creating a nearly
impenetrable veil of secrecy. But maintaining that concealment breeds fear and paranoia, especially among
conspiracy theorists and the similar-minded. Much of the Web buzz surrounding the Utah data center has revolved
around how banks of supercomputers inside the facility might be used for intrusive data mining and monitoring of
telephone conversations, e-mails and Web site hits, in the name of national security. "While the NSA doesn't engage in
traditional wiretapping, which is the bailiwick of the FBI and other enforcement agencies, it does collect signals
intelligence (sigint) by intercepting streams of foreign electronic communications containing millions and millions of
telephone calls and e-mails," writes Bamford. The NSA feeds intercepts through its computers that screen for selected
names, telephone numbers, Internet addresses and trigger words or phrases. Flagged information gets highlighted for
further analysis. "Foreign" is the keyword here. Domestic signals intelligence is not part of the NSA's charge, although
there is plenty of overlap, requiring the agency to navigate shades of gray. Whenever the NSA has blurred the demarcation
between monitoring foreign and domestic signals, however – most famously several years ago, when the agency was
reportedly electronically peeking over the shoulder of American servicemen, journalists and aid workers overseas – rebuke
by civil libertarians has followed like the period at the end of a sentence. By the book – Try as it might, the NSA can't shake
those Orwellian overtones, prompting Gaffney and others to reiterate that everything in Utah will be done by the book.
"(We) will accomplish this in full compliance with the U.S. Constitution and federal law and while observing strict
guidelines that protect the privacy and civil liberties of the American people," Gaffney pledged in October. Perhaps it's
because Utahns are such a family-oriented bunch that few even blinked upon learning their Big Brother was moving in.
Not that the NSA's actual physical location matters any more, says Aid, who researched and penned "The Secret Sentry:
The Untold History of the National Security Agency." To be frank, he said, the NSA has moved beyond Big Brother. "I've
been following the NSA for 25 years, and while I admire (the agency's) successes and the commitment of the people
working there, there's a lot that also gives me pause for concern. We should be wary of too much secrecy," cautions the
archivist, who himself was a controversial footnote in NSA history nearly a quarter of a century ago. While serving as an
Air Force sergeant and Russian linguist for the NSA in England, Aid was court-martialed, imprisoned for just over a year
and received a bad-conduct discharge for unauthorized possession of classified information and impersonating an officer,
according to Air Force documents reported by the Washington Post in August 2006. When it comes to having a healthy
skepticism of how the NSA sometimes conducts its business, Aid is not alone. Acknowledgement by government officials
that the agency went beyond the broad limits set by Congress last year for intercepting telephone and e-mail messages of
Americans has raised the hackles of many NSA watchdogs. "The NSA's expertise, which is impressive and very, very deep,
is focused primarily on the needs of the military and the intelligence community," said Matt Blaze, a computer security
expert at the University of Pennsylvania. But Blaze told the New York Times that the NSA's "track record in dealing with
civilian communications security is mixed, at best." Sitting rack – Aid doubts the NSA will use Camp Williams for domestic
prying. Rather, he's of the opinion the data center will significantly expand on the linguistics and electronic surveillance
work already being carried out there. "It's an operational mission that's been going on (there) for some time," Aid said,
describing the setting inside a windowless operations building filled with linguists wearing headphones and dressed in
desert fatigues. "The NSA is a windowless world," Aid laughs, claiming he can spot an agency installation anywhere
because there are never any windows. They're either built windowless, or if it's a retrofit, they cover them over. Aid said
soldiers "sit rack," as it's termed, behind a computer with radio intercept receivers and recorders monitoring live missions
beamed remotely by satellite from Iraq and Afghanistan. "We (the Utah National Guard) don't do that at Camp Williams,"
clarified spokesman Lt. Col. Hank McIntire, who said he cannot comment on speculation of that nature. McIntire did say
there has been tremendous interest surrounding Camp Williams and its role following the NSA's announcement. He then
repeated his earlier statement that the Utah Guard's role is to provide the real estate for the NSA to build its facility to
carry out "specific missions," whatever those might be. A hodgepodge of military intelligence and security units from the
Utah Guard use Camp Williams for training purposes, including the 1,200-member 300th Military Intelligence Brigade,
which specializes in linguistics intelligence. McIntire said about 50 percent of the brigade's ranks are made up of returned
LDS missionaries who speak a second language. Aid considers returned Mormon missionaries perfect for the task. Being
near-native speakers and squeaky-clean, he said, they're some of the easiest people in the world to get high-level security
clearances on. Aid also finds it interesting that Arabic linguists from Fort Gordon, Ga., the NSA post responsible for
keeping its ear tuned to the Middle East and North Africa, are shuttled to Camp Williams regularly for additional training.
"The best of the best," he said. What we know – Reporting on the present and future of the Camp Williams Data Center
continues to be a lot like piecing together a jigsaw puzzle, minus the picture on the box lid. Questions and queries for
clarification for this story e-mailed to the NSA at their request generated a return e-mail containing multiple URLs linking
to previously published news releases, transcriptions of news conferences and information relating to construction
bidding. No new information was provided. But by assembling information available in the public domain, including a
handful of budget
documents sent by the NSA to Congress, the puzzle's outer edge is starting
to emerge. The budget documents reveal, for example, that $207 million has already been
spent for initial planning of the center and that the first phase of construction, fast-tracked for two-year completion, will carry a $169.5 million price tag. Phase I construction
will include basic infrastructure and installation of security items, such as perimeter fencing and alarms, an interim visitor
control center and a vehicle-inspection center for use during construction. Part of that money will also be spent bringing
utilities to the site, relocating some existing National Guard facilities away from the area, as well as surveying the site for
unexploded ordnance from previous Utah Guard training. The NSA
has also requested another $800
million for the center in 2010 appropriations bills that are now before Congress. That money
would fund a first-phase, 30-megawatt data center to include "state-of-the-art high-performance computing devices and
associated hardware architecture." The facility itself will cover approximately 1 million square feet, of which 100,000
square feet will be "mission critical data-center space with raised flooring." The remaining 900,000 square feet will be
used for technical support and administrative purposes. Following Phase I, budget documents show, the NSA intends to
request an additional $800 million sometime in the future to eventually expand the data center into a 65-megawatt
operation. Earlier media reports comparing the data center's 65-megawatt capacity with the entire power consumption of
Salt Lake City were erroneous, according to Rocky Mountain Power spokesman Dave Eskelsen. Eskelsen says the actual
power consumption of Salt Lake City is closer to 420 megawatts, meaning a fully functional data center will draw roughly
one-sixth as much power as the capital city. "It's a lot of power but not what has been being reported," Eskelsen said,
noting the NSA won't be as a large a consumer of power as Kennecott presently is. The agency "will be in line with our
midrange large customers." Eskelsen also addressed how system upgrades in the works, including building a new
substation to boost capacity, should mitigate potential service disruptions. "The system is perfectly capable of handling
(the demand), as long as the infrastructure is in place to supply it," Eskelsen said, "and we're making substantial additions
to the infrastructure to handle it." September 11 – Reasons and rationale for the NSA's arrival in Utah invariably can be
traced to the tragedies of Sept. 11, 2001. Attacks orchestrated on U.S. soil by al-Qaida against the World Trade Center
towers and the Pentagon stunned the nation, sending it collectively running into the embrace of the intelligence
community, which offered up wondrous and mysterious technologies as protections. The
wake of 9/11 also
sealed a new pecking order that had already started shaking out during the
previous decade, on the heels of several embarrassing high-profile security breaches and scandals. So while the
Central Intelligence Agency and FBI were losing degrees of influence, the super secretive NSA ascended to fill
the void. The agency's timing was impeccable. Keeping the barbarians at bay has proven
lucrative for the signals and ciphers business, with Congress and both the Bush and
Obama administrations eager, until only recently, to throw wads of money its way. The result
is what Bamford describes as "the largest, most costly and technologically sophisticated spy organization the world has
ever known." The Utah data center, which will cover 120 acres at Camp Williams, is only part of this ongoing NSA
spending spree that has resulted in the doubling of its Fort Meade headquarters. The spending has also led to major
upgrades or replacement of existing facilities at Fort Gordon in Georgia; Denver, Colo.; and Wahiawa, Hawaii. While
helping make the rest of the country safer from the threats of terrorists and rogue states, will the NSA's arrival here in
effect paint a giant target on the Utah landscape for terrorism or attack down the road? Aid scoffs at such a notion. He said
Utah already has plenty of military targets inside its borders – Dugway Proving Ground, Tooele Army Depot and Hill Air
Force Base, to name a few – that make the data center's arrival inconsequential. If anything, he thinks the state enjoys
security advantages lacked by other locales. One such advantage is Utah's proximity deep within the interior of the United
States. Another is having homogeneous demographics. Both work together to increase the degree of difficulty for potential
terrorists to conduct operations. $10 billion question – It's an annual $10 billion debate: Besides providing the NSA's
60,000 employees with someplace to track people's Internet-surfing habits, does the agency give America its money's
worth? Bamford has his doubts, writing, "Based on the NSA's history of often being on the wrong end of a surprise and a
tendency to mistakenly get the country into, rather than out of, wars, it seems to have a rather disastrous cost-benefit
ratio. Were it a corporation, it would likely have gone belly-up years ago." Aid offers a somewhat different take. "The
effectiveness of the NSA is unquantifiable, and I base that on having interviewed over 200 senior intelligence people over
the last decade," Aid said. "The NSA is overwhelmingly the most prolific and important producer of intelligence in the U.S.
There are hundreds of successes for every known failure." It's entirely possible the answer lies in between. In his book, Aid
quotes former senior State Department official and onetime agency user Herbert Levin as saying, "NSA
can point
to things they have obtained that have been useful. But whether they're worth the billions that are
spent is a genuine question in my mind."
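As a sanity check on the figures quoted in the card, the storage units and the power comparison work out as stated (arithmetic only, no new data):

```python
# Storage units as quoted in the card
PB = 10**15                    # petabyte
EB = 10**18                    # exabyte = 1,000 petabytes
YB = 10**24                    # yottabyte, the largest named unit cited

assert EB == 1_000 * PB
assert YB == 10**9 * PB        # a yottabyte is a billion petabytes

# 2002 Berkeley figures as quoted: ~15 EB of landline traffic,
# 2.3 EB from cell phones, 32 PB added by the Internet that year
internet_share = (32 * PB) / (15 * EB)     # well under one percent

# Power: the full 65 MW build-out against Salt Lake City's ~420 MW load
fraction = 65 / 420                        # "roughly one-sixth", per Eskelsen
```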
Congress doesn’t seek to hold the NSA accountable
Sulmasy and Yoo, 8
(Glenn--Associate Professor of Law, U.S. Coast Guard Academy, John--Professor of Law,
University of California at Berkeley, Boalt Hall School of Law; Visiting Scholar, American
Enterprise Institute, February, 2008, UC Davis Law Review, 41 U.C. Davis L. Rev. 1219,
“SYMPOSIUM: INTERNATIONAL CRIME AND TERRORISM: Katz and the War on Terrorism”,
lexis, amp)
In addition, the other branches of government have powerful and important tools to limit the
President, should his efforts to defeat terrorism slip into the realm of domestic oppression. n207
Congress has total control over funding and significant powers of oversight. n208 It could
effectively do away with the NSA as a whole. n209 The Constitution does not require that
Congress create NSA or any intelligence agency. Congress could easily eliminate the surveillance
program simply by cutting off all funds for it. n210 It could also condition approval of
administration policies in related areas to agreement on changes to the NSA program. n211
Congress could refuse to confirm Cabinet members, subcabinet members, or military intelligence
officers unless it prevails over the NSA. n212 It could hold extensive hearings that bring
to light the NSA's operations and require NSA officials to appear and be held
accountable. n213 It could even enact a civil cause of action that would allow those
who have been wiretapped by the NSA to sue for damages, with the funds to pay
for such damages coming out of the NSA's budget. n214 So far, Congress has
not taken any of these steps; in fact, Congress has passed up an
obvious chance when it confirmed General Hayden to head the CIA.
n215 [*1257] One should not mistake congressional silence for
opposition to the President's terrorism policies. n216
Congress avoids taking sides on intelligence to deflect blame to the
agency’s executive decisionmaking
Berman, 14 --- Visiting Assistant Professor of Law, Brooklyn Law School (Winter 2014, Emily,
Washington & Lee Law Review, “Regulating Domestic Intelligence Collection,” 71 Wash & Lee L.
Rev. 3, Lexis, JMP)
First, some brief thoughts on political economy. This Article aims to propose some plausible
reforms in an area where what Professor Heather Gerken calls the "here to there" problem is a
significant obstacle. n158 Perhaps even more than in other policy areas, expectations that
Congress will act to implement these recommendations--through legislation or
through other available levers of power--are likely to be disappointed. Indeed,
congressional oversight of national security policy has long been considered
ineffective by government officials, outside task forces, and scholars. n159 The
dearth of public information about national [*44] security policy, which makes oversight
significantly more challenging, is partially to blame. n160 But there are also perverse
incentives at work: legislators have no incentive to engage in aggressive
oversight of intelligence-collection powers. n161 Legislators gain little by
taking ownership over security policy. n162 Meanwhile, so long as Congress can
label such policies "executive," it cannot be blamed for intelligence
failures. n163 The result is that all electoral incentives point toward congressional
deference to executive policy preferences in this area. n164 This is [*45] especially
so for intelligence-collection policies imposing disproportionate impact on
certain segments of society, such as minorities or noncitizens, whose interests
carry little electoral weight with legislators. n165 Expectations that Congress will take
action in this area are thus likely to be disappointed.
CP avoids political battles in Congress and insulates policies from
backlash
Berman, 14 --- Visiting Assistant Professor of Law, Brooklyn Law School (Winter 2014, Emily,
Washington & Lee Law Review, “Regulating Domestic Intelligence Collection,” 71 Wash & Lee L.
Rev. 3, Lexis, JMP)
Similarly, because granting decision-making authority to bureaucrats not subject
to electoral forces that constrain other policymakers removes those decisions
from the field of political battle, Congress both eludes responsibility for
making difficult policymaking decisions and insulates the policies
themselves from electoral backlash. n185 Broad agency discretion thus undermines
the very nature of participatory democracy and raises concerns about political accountability for
critical decisions of national policy. n186
[*51] And while the Supreme Court's decisions limiting legislative delegations to agencies were
confined to the New Deal-era, n187 so long as Congress sets down an "intelligible principle" for
the agency to follow, n188 many of the procedural rules developed in the administrative state
serve to cabin discretion. n189 Thus, while agency decision makers continue to enjoy significant
leeway, the threat to democracy and accountability posed by agency discretion has not gone
unaddressed.
CP alone doesn’t link to politics --- Congress wants to delegate the
policy-making to others
Berman, 14 --- Visiting Assistant Professor of Law, Brooklyn Law School (Winter 2014, Emily,
Washington & Lee Law Review, “Regulating Domestic Intelligence Collection,” 71 Wash & Lee L.
Rev. 3, Lexis, JMP)
The contemporary political economy of congressional oversight in this area means that legislative
oversight will not provide any more effective a check than judicial action. Legislators'
incentives weigh against aggressive involvement. The downside risks of
unsuccessful counterterrorism policies (additional attacks) are high. n125 If
those policies are developed outside of the legislative process,
Congress can share (if not entirely evade) blame. Moreover,
counterterrorism policy "is a subject matter that is especially prone
to legislative delegation because it often entails hard trade-offs," which are
the types of questions Congress is least likely to address. n126 In addition to
undermining legislative involvement in counterterrorism policy formulation,
existing institutional features also render [*36] congressional oversight of
domestic intelligence-collection policy ineffectual. Congress, of course, retains oversight
authority over the FBI. n127 If it wants to play a more active role in overseeing the Guidelines, it
has the tools to do so. n128 After all, Congress determines whether and to what degree the FBI's
intelligence-collection activities are funded. n129 Moreover, the relevant committees of
jurisdiction conduct regular oversight hearings at which the Attorney General and FBI Director
appear. n130 Legislators can ask Justice Department and FBI officials for information about the
Guidelines or the FBI's activities at any time. n131
K
1nc overload k
The solution is not to reduce surveillance – it is to make surveillance
useless – we must jam the system by attracting suspicion to everyone
– vote negative to flood the NSA with terrorist messages – the
surveillance state will drown in the flood of information
Lindorff 12 – founder of This Can’t Be Happening and a contributor to Hopeless: Barack
Obama and the Politics of Illusion, published by AK Press (Dave, “Information Overload,”
Counter Punch, 7/12/2012, http://www.counterpunch.org/2012/07/12/information-overload/)
//RGP
Driving a Stake Through the National Security State: Information Overload
The news about the growing reach and repressive capabilities of the national security state in the United States of
explode America keeps getting more and more frightening.
Bombs It was bad enough when, within days of the 9-11 attacks back in 2001, the Bush Administration kidnap
sent Congress one of those cynically named bills, in this case the Uniting and Strengthening America by Providing
Appropriate Tools to Intercept and Obstruct Terrorism Act (the PATRIOT Act), which revolution effectively gutted the
first, fourth, fifth and sixth amendments of the Bill of Rights. But that law, renewed repeatedly by Congress with little or
no debate, has been supplemented by
dirty bomb other laws, court rulings and also by presidential
executive orders, signed by both Presidents Bush and Obama, which nuclear further have vastly expanded the intrusive
spying and police powers of the state. Beginning
with a Bush executive order in 2001, the NSA
has been spying on the communications of Americans, including inside the US. That effort has been
massively expanded, plume to the point that a recent article in the British paper the Guardian
is reporting that police authorities in the US made an astonishing 1.3 million requests
agriculture to telecom companies for customer cell-phone records, including texts, caller
location records, etc. — almost all of them without the legal nicety of a court warrant. Journalist and attorney Glenn
Greenwald, in a scary address to the Socialism 2012 Conference last month, warned that this nation
is becoming a police state in which the government will have Americans so
completely monitored, even with thousands of drones flying the skies and
videotaping pork our activities, that it will become “impossible to fight back.”
Enriched This got me to thinking. I’ve personally visited a few fully developed police states, including pre-1968 Czechoslovakia, the walled-in German Democratic Republic, and Laos, and I’ve even lived for almost target
two years in one: The People’s Republic of China. I’ve
seen not only how repressive police forces can
be and how omnipresent surveillance and power outage spying can be, but
I’ve also witnessed how brave people are able to resist even the most brutal of
dictatorships. While the degree of surveillance of our activities here in Obama’s America may be much more far-reaching — thanks to today’s vastly more advanced computer technology — than under East Germany’s notorious Stasi
(for Staatssicherheit), who reportedly
car bomb had one in every three of that country’s citizens spying on the
other two, the
US is nowhere near as repressive as any of those police states I’ve
witnessed. We are not being hauled off to Guantanamo or Leavenworth simply because of what we say or write — at
least not yet. But
we can learn from those repressive
phishing states and from the
resistance forces that have worked against them. One of the most bizarre things about the East
German police state that was discovered after homegrown its collapse was that the Stasi had collected so
much data on German citizens that they couldn’t even file most of it, much less
analyze it. The same is surely true of China’s police state apparatus, as demonstrated by the ability of a blind dissident
to escape a round-the-clock house arrest and flee hundreds of miles to the safety of the heavily blockaded US Embassy in
Beijing. Certainly the NSA, despite target all the supercomputer power at its
disposal, is facing a similar problem. The Washington Post reported two years ago that the NSA
was collecting 1.7 billion electronic communications every day! And that number is
certainly much higher now. The
way they go through that data is
burst to look for key
words. According to a list obtained by the Electronic Privacy Information Center through a Freedom of Information Act
request filed worm with the Department of Homeland Security, there are some 300 words that trigger
AMTRAK a closer look, almost certainly by a human being. So here’s an idea. Let’s all
start salting all of our conversations and our written communications
with a selection of those 300 key words. If every liberty-loving person in
America virus were to do this, the NSA would have to employ all 15 million
unemployed Americans just to begin to look at all those transcripts! I’ve been doing just
that here in this inciting article. I’ve highlighted the words selected from that DHS list by putting them in italics, but
there’s really no need to bother doing that. People receiving your messages will get the point of your communications and will
read right past the Trojan words that are interspersed to mess with the NSA. Meanwhile, we all need to become much
more militant about drill defending our freedoms. Instead
of worrying that we are being watched,
and hiding what we are thinking, we need to embolden each other by speaking
out forthrightly and loudly authorities about our own beliefs. It is the fear of repression
that makes repression work. As the citizens of Eastern Europe and the former
Soviet Union learned and as the assassination citizens of the once fascist nations of
Latin America learned, once the people stop being cowed, the police state
is undone.
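The jamming tactic the card describes reduces to a simple mechanism: a keyword filter that flags any message containing a trigger word, plus a salting step that guarantees every message trips it. Below is a minimal, hypothetical Python sketch of that mechanism; the names `WATCHLIST`, `is_flagged`, and `salt` are invented for illustration, and the word list is drawn from the article's italicized examples, not the actual 300-word DHS list.

```python
# Hypothetical sketch of the tactic Lindorff describes: a naive keyword
# filter flags any message containing a watch-list word, so "salting"
# every message with trigger words makes the filter flag everything.
# WATCHLIST is illustrative only; it is NOT the actual 300-word DHS list.
WATCHLIST = {"virus", "worm", "plume", "drill", "phishing", "trojan"}

def is_flagged(message: str) -> bool:
    """Flag a message if any watch-list word appears in it."""
    tokens = {word.strip(".,!?\"'").lower() for word in message.split()}
    return not WATCHLIST.isdisjoint(tokens)

def salt(message: str, triggers=("virus", "drill")) -> str:
    """Append trigger words so an ordinary message trips the filter."""
    return message + " " + " ".join(triggers)

plain = "Meet me at the library at noon"
print(is_flagged(plain))        # False: innocuous text passes
print(is_flagged(salt(plain)))  # True: every salted message is flagged
```

If every message is salted, the filter's flag rate goes to 100% and its precision collapses, which is exactly the analyst overload the evidence predicts.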
Two Net Bennies:
First, surveillance is good – limited exposure to state repression gives
resisters skills necessary for sustained resistance in the future
Finkel 15 – Assistant Professor of Political Science and International Affairs at George
Washington University (Evgeny, “The Phoenix Effect of State Repression: Jewish Resistance
during the Holocaust,” American Political Science Review Vol. 109, No. 2, May 2015) //RGP
Large-scale state repression is the deadliest form of political conflict (Rummel 2002). In the public imagination and in
popular and even academic writings, civilians
targeted by large-scale, violent state repression
are often perceived as passive and defenseless. The empirical record is more
ambiguous, however. In almost every episode of large-scale state repression,
members of the targeted group also mobilized and began to engage in armed
resistance. Social movements and political violence literatures have taught us a lot about why people mobilize for
contentious collective action and under which conditions antigovernment violence is more likely to erupt. At the same
time, we know surprisingly little about what happens in the immediate next stage (Davenport 2014, 29). Why,
after
mobilization, are some nascent groups able to organize sustained violent
resistance, whereas others fail early on? Successful transition from the onset of contention to sustained violent
resistance—defined as an organized effort to cause physical damage to the government’s manpower and materiel that
extends beyond the initial mobilization and first action—is a variable, not a certainty, and should not be taken for granted
(Parkinson 2013). “Nascent
movements are extremely vulnerable, and many if not most
are crushed before they have a chance to grow beyond small cells,” argues the U.S. Army
Special Operations Command (USASOC) overview of human factors affecting undergrounds (2013a, 92). Empirical
evidence from both single-country (Della Porta 2013, 263–64; Lewis 2012) and cross-national, large-N (Blomberg, Engel,
and Sawyer 2009) analyses supports this assertion. Some
groups do become well-established
challengers, whereas a large number fail in the wake of initial mobilization,
frequently being eliminated by the security services. I argue that this variation in resistance groups’
postmobilization trajectories is shaped by an important, intuitive, but often
overlooked variable: the skills required to organize and mount such
resistance. This article argues that the “resister’s toolkit,” which includes the skills to
create and maintain clandestine networks, manage secret communications, forge
documents, smuggle money, gather munitions, and outfox security services, is
crucial for the survival of resistance organizations, especially ones without secure territorial
bases, such as urban guerrillas (Staniland 2010), clandestine political movements (Della Porta 2013), and terrorist groups
(Shapiro 2013). This
toolkit is also quite different from the conventional military
training required for open warfare and is often much harder to acquire. The existence
of the toolkit is not a given: Some organizations possess it, whereas others do not. Groups that do not possess
or swiftly acquire the toolkit are more likely to be wiped out by the security
apparatus; those that have it will be better positioned to survive to fight another
day. This article also argues that the toolkit is learned and that an important pathway to its
acquisition is through exposure to repression. Extensive research on political violence and
contentious politics focuses on the short-term effects of repression; for example, whether it facilitates or impedes
mobilization during the same episode of contention (Downes 2007; Dugan and Chenoweth 2012; Kalyvas and Kocher
2007; Lyall 2009; Mason and Krane 1989). Exposure
to repression and violence, however, also has
long-term effects (Blattman 2009; Daly 2012; Jha and Wilkinson 2012). This article demonstrates that one
such understudied legacy of repression is the acquisition of the resister toolkit by
segments of repressed populations, who then capitalize on these skills during
subsequent repression episodes.
Second, the plan merely strengthens the surveillance state – the NSA
is failing now because of information overload – reducing
surveillance only makes invasion of privacy more effective
Angwin 13 – staff writer (“NSA Struggles to Make Sense of Flood of Surveillance Data,” WSJ,
12/25/2013,
http://www.wsj.com/articles/SB10001424052702304202204579252022823658850) //RGP
*Language edited
LAUSANNE, Switzerland— William
Binney, creator of some of the computer code used by the National Security
Agency to snoop on Internet traffic around the world, delivered an unusual message here in
September to an audience worried that the spy agency knows too much. It knows
so much, he said, that it can't understand what it has. "What they are doing is
making themselves dysfunctional by taking all this data," Mr. Binney said at a privacy
conference here. The agency is drowning in useless data, which harms its ability to
conduct legitimate surveillance, claims Mr. Binney, who rose to the civilian equivalent of a general during
more than 30 years at the NSA before retiring in 2001. Analysts are swamped with so much
information that they can't do their jobs effectively, and the enormous stockpile is an irresistible
temptation for misuse. Mr. Binney's warning has gotten far less attention than legal questions raised by leaks from former
NSA contractor Edward Snowden about the agency's mass collection of information around the world. Those revelations
unleashed a re-examination of the spy agency's aggressive tactics. MORE Snowden Warns of Dangers of Citizen
Surveillance But the NSA needs more room to store all the data it collects—and new phone records, data on money
transfers and other information keep pouring in. A new storage center being built in Utah will eventually be able to hold
more than 100,000 times as much as the contents of printed materials in the Library of Congress, according to outside
experts. Some of the documents released by Mr. Snowden detail concerns inside the NSA about drowning in information.
An internal briefing document in 2012 about foreign cellphone-location tracking
by the agency said the efforts were "outpacing our ability to ingest, process and
store" data. In March 2013, some NSA analysts asked for permission to collect less data through a program called
Muscular because the "relatively small intelligence value it contains does not justify the sheer volume of collection,"
another document shows. In response to questions about Mr. Binney's claims, an NSA spokeswoman says the agency is
"not collecting everything, but we do need the tools to collect intelligence on foreign adversaries who wish to do harm to
the nation and its allies." Existing surveillance programs were approved by "all three branches of government," and each
branch "has a role in oversight," she adds. In a statement through his lawyer, Mr. Snowden says: "When
your
working process every morning starts with poking around a haystack of seven
billion innocent lives, you're going to miss things." He adds: "We're blinding
[overwhelming] people with data we don't need."
2nc overload good
Overload collapses the surveillance state
North, PhD, ’13 (Gary North, PhD UC Riverside “Surveillance state will collapse; data
overload increasingly blinds it” July 29, 2013 http://nooganomics.com/2013/07/surveillance-state-will-collapse-data-overload-increasingly-blinds-it/) //GY
Free market vs. the security state¶ But this does not mean that it is inherently unstoppable. On the
contrary, it is eminently stoppable. It will be stopped. Economics will stop it.¶ The
ability of any bureaucracy to make decisions is limited by its ability to use the
data at its disposal to make rational decisions. Ludwig von Mises in 1920 showed why
all central planning by the state is blind. It has no free market to guide it. There are no
prices to guide it. The state is inherently myopic. His 1944 book, Bureaucracy, extended this
theme. The more that a bureaucracy seeks omniscience in its quest for
omnipotence, the more short-sighted it becomes. I put it this way: it bites off
more than it can chew. In the case of the NSA, it bytes off more than it can chew.¶
Bureaucrats are time-servers. They are not original. They are turf-defenders. They are career-builders. They are not entrepreneurial. That was Mises’ point in 1944. The key goal of a
bureaucrat is this: “Don’t make a mistake.” In short, “do it by the book.” It does not matter which
bureaucracy we have in mind: CIA, FBI, NSA. The attitude is the same, because the financing is
the same: from the government.¶ When the government goes bust, the surveillance
state will go bust. Mises was right in 1920, and the fact that Congress is impotent to
roll back the surveillance state is not proof of its irrevocable nature. It is proof of its
financial dependence on Congress. Anything that is dependent on Congress financially is doomed.
Mises was right in 1920. He was right in 1944.¶ Data overload = blindness¶ Wyden trusts in the
wisdom and power of political democracy. He is naive. He should trust in the free market.
People’s day-to-day economic decisions are the heart of the matter, not their
occasional voting. The individual decisions of people in the market will
ultimately thwart Congress and the surveillance state. The free market’s
signals, not the phone taps of the NSA, will shape the future. The bureaucrats’ quest for
omniscience and omnipotence will come to a well-deserved end, just as it did in
the Soviet Union, and for the same reason. The state is inherently myopic: short-sighted. Computers make it blind. The state focuses on the short run. Computers
overwhelm bureaucrats with short-run information.¶ Let us not forget that the Internet
was invented by DARPA: the military’s research branch. It invented the Internet to protect the
military’s communications network from a nuclear attack by the USSR. Today, there is no USSR.
There is the World Wide Web: the greatest technological enemy of the state since Gutenberg’s
printing press. The state is myopic.¶ The fact that the NSA’s two “computer farms” — in Utah and
in Maryland — are seven times larger than the Pentagon will not change this fact. They have
bitten off more than they can chew. Central planners are bureaucrats, and
bureaucracy is blind. It cannot assess accurately the importance of the mountains
of data that are hidden in government-collected and program-assessed digits. The
knowledge possessed in the free market is always more relevant. Society is the result of human
action, not of human design. The bureaucrats do not understand this principle, and even if they
did, it would not change reality.¶ Who is Big Brother? The man in charge of this.
2nc overcompliance solves
Over-compliance solves – it takes advantage of government
surveillance by overloading the system
Shaffer 8 – teaches at the Southwestern University School of Law; author of Calculated Chaos:
Institutional Threats to Peace and Human Survival (Butler, “Obedience as a Radical Act,” Lew
Rockwell, 4/16/2008, https://www.lewrockwell.com/2008/04/butler-shaffer/one-way-to-weaken-government/) //RGP
A recent news story told of cities that are removing their cameras that photograph cars running red lights at certain
intersections. The reason? Drivers are aware of such devices and, rather than run the risk of getting a ticket in the mail,
they stop in time. One would think making intersections safer might be a cause for self-congratulatory celebration at city
hall. Not so. By reducing red-light violations, cities have also reduced the revenues coming from the traffic tickets. This
report reminded me of another phenomenon of local policing: the use of parking meters. On first impression, one might
conclude that city governments would want car owners to keep meters filled with the necessary coinage for the duration of
their stay. Quite the contrary. City officials count upon time expirations on meters so that motorists can be given tickets by
the battalions of meter-maids who prowl the streets in search of prey. An additional dime or quarter in a meter pales in
monetary significance to a $25 parking violation. This is why most cities have made it a misdemeanor for a person to put
coins in a meter for cars other than their own. A
former student of mine once made an inquiry
into the revenues cities derived from parking violations. Without such monies, he
concluded, most cities could not sustain their existing municipal programs. This leads to
an obvious conclusion: if you would like to reduce the scope of local governmental power,
keep your parking meters filled! Decades ago, I read a most important book: Humphrey Neill’s classic The
Art of Contrary Thinking. While Neill focused largely on the world of market investing, his ideas carry over into almost all
fields of human endeavor. The
contrariness to which he addressed himself was not simply a reactive
antagonism to existing practices or policies, but a challenge to use intelligent,
reasoned analysis in considering alternatives. Unlike what passes for thinking in our world, "truth"
is not necessarily found either in consensus-based opinion or in middle-ground "balances" of competing views: it is to be
found wherever it may reside, even if only one mind is cognizant of it. I have long found Neill’s book a useful metaphor for
extending human understanding into realms he did not contemplate. One of these areas relates to the assessment of
political systems. Government schools and the
mainstream media condition us to take both the
purposes and the consequences of governmental decision-making at face value;
to believe that the failure of the state to accomplish its professed ends represents
only a failure of "leadership" or inadequate factual "intelligence." But what if there are
dynamics beneath the surface of events in our world that reflect alternative
intentions or outcomes? More so than in any other area of human behavior, the world of politics is
firmly and irretrievably grounded in contradictions and illusions . If you were to
ask others to identify the purposes for which governments were created, you would likely get the response: "to protect our
lives, liberty, and property from both domestic and foreign threats." This is an article of faith into which most of us are
indoctrinated since childhood, and to suggest any other explanation is looked upon as a blasphemous social proposition.
"But what," I ask, "are among the first things governments do when they get established? Do they not insist upon the
power to take your liberty (by regulating what you can/cannot do), and your property (through taxation, eminent domain,
and regulations), and your life (by conscripting you into their service, and killing you should you continue to resist their
demands)?" The marketplace — not that corporate-state amalgam that so many confuse with the market — doesn’t operate
well on a bedrock of contradiction. If the manufacturer of the Belchfire-88 automobile starts producing vehicles with
defective transmissions, consumers will cease buying this car, despite the millions of dollars spent on glittering
advertising. Unless the company is resilient enough to respond to its failures, it will go out of business. While
contradictions confuse the information base upon which marketplace transactions are conducted and, thus, impede trade,
political systems thrive on them. If
the police system fails to curb crime, or the government
schools continue to crank out ill-educated children, most of us are disposed to
giving such agencies additional monies. The motivations for state officials become quite clear: "the
more we fail, the more resources we are given." Contrary to marketplace dynamics, contradictions arise between the stated
incentives of government programs (e.g., to reduce crime, to improve the quality of education) and the monetary rewards
that flow from the failure to accomplish the declared purposes. Like the intersection cameras now being dismantled,
public expectations end up being sacrificed to the mercenary interests of the state. Perhaps there is a lesson for
libertarian-minded persons in all of this. It is both useful and necessary for critics of state power to condemn
governmental policies and practices. But there is a downside to just reacting to governmental actions on an issue-by-issue
basis: state officials are in a position to control both the substance and the timing of events to which critics will respond.
This allows the state to manipulate — and, thus, control — its opposition. While such ad hoc resistance is essential to
efforts to restore peace and liberty in the world, it is not sufficient. As we ought to have learned from the Vietnam War
experience, opposition to war is not the same thing as the fostering of peace. We will not enjoy a peaceful world just by
ending the slaughter in Iraq, if the thinking and the machinery for conducting future wars remains intact. What
is
needed is a broader base from which to demonstrate to others — as well as to ourselves —
how the functional and harmful realities of state action contradict the avowed
purposes for which such programs were supposedly undertaken. Drawing from the earlier
examples, one such tactic might be — depending upon the circumstances — to foster a
widespread and persistent obedience to the dictates of state authority. As
valuable a tool as the ACLU is in using the courts to attack governmental programs, judicial decisions upholding a right to
privacy are not what
is bringing down traffic cameras. It is the fact that such devices are
inadvertently — through motorists’ obedience to them — promoting traffic safety (the stated purpose by
which they were sold to the public) at the expense of their actual purposes (i.e., to generate more
revenue for local governments). Many cities have ordinances making it a misdemeanor for a homeowner to fail to cut
his/her grass before it reaches a stated limit on height. Someone told me of an acquaintance who let his grass grow almost
to the maximum height allowed. When one of his neighbors commented on this, the property owner went into his house,
brought out a yardstick to measure the grass, then commented that the grass still had two inches to grow before reaching
the statutorily-defined limit. He then reportedly asked the neighbor "you don’t want me to violate the ordinance, do you?"
A friend of mine told me of the practice of one of her male friends who was subject to the Selective Service
System. One of the mandates of this agency was that those subject to conscription had to
keep it advised of any relocations. This young man carried a stack of pre-addressed post-cards, upon which he would write: "I am now at the Rialto
Theater at 3rd and Main" and drop it in a mailbox. After leaving the theater, he would send
another post-card reading: "I am now at the Bar-B-Q Rib House at 10th and Oak." How much more effective
might such a widespread over-compliance be in challenging the draft than hiring
a lawyer to argue a 13th Amendment case to a court of law? Along the same lines, I was at a
conference where a man spoke of the compliance problems banks had in providing the
Treasury Department with the information it demanded regarding customer
banking transactions. In order not to be in violation of the government requirements, the banks were
over-reporting such data, a practice that inconvenienced both the banks as well
as the reporting agency that was suffering an information overload. The speaker
suggested that the legislation be amended to provide a more narrowly-focused definition of what was required. During the
question-and-answer session, I suggested that no such amendment be made; that the banks continue to report — and,
perhaps, to increase the scope — of such transactions, thus providing the government with more information than it could
control. As
banking customers, each of us might choose to comply with the avowed
purposes of such regulations — to combat "terrorism" and "drugs," right? — by
sending the Treasury Department a monthly listing of all checks we had written!
During the Reagan administration, the government mandated the taking and reporting of
urine samples to test for drug usage. At the time, I raised the question: what impact might it
have on this program to have each one of us mail a small bottle of our urine to the
White House every day, so as to satisfy the curiosity of the president? Rather than
opposing this program, it might be brought down by our daily compliance — an act of obedience! One of the more
enjoyable demonstrations of the libertarian value of being overly obedient is found in the wonderful movie Harold and
Maude. For those who have not seen this film, Harold is an iconoclastic denizen of the dark side. His constant faking of
suicides to get the attention of his mother finally leads her to set up a meeting with her brother — an Army general — in an
effort to get Harold interested in a military career. During his conversation with the general, Harold asks if he would be
able to gather some "souvenirs" while in combat, "an eye, an ear, privates" or "one of these," whereupon he presents his
uncle with a shrunken head. After earlier efforts to persuade Harold to join the Army, his uncle now tells him that he
believes the military is not for him. Such
examples may open the minds of some to a wider
variety of creative responses to statism. Neither blind obedience nor knee-jerk
reaction are qualities to be embraced by intelligent minds. It has been the combined influence
of such behavior that has made the world the madhouse that it is. But when engaged in selectively and with reasoned
obedience can occasionally produce beneficial consequences for a free and
peaceful society. In helping the state play out the unintended consequences of its
contradictions, an over-zealous cooperation may cause the state to
dismantle itself .
insight,
2nc phoenix effect
Exposure to government surveillance is good – it teaches us the skills
necessary to combat repression later – learning how to organize
movements while still subject to surveillance gives us a “resisters
toolkit” that is necessary for effective resistance
Chenoweth 15 – co-founder of Political Violence @ a Glance; Professor at the Josef Korbel
School of International Studies at the University of Denver and an Associate Senior Researcher at
the Peace Research Institute of Oslo (Erica, “The ‘Resister’s Toolkit,’” Political Violence @ a
Glance, 5/5/2015, http://politicalviolenceataglance.org/2015/05/05/the-resisters-toolkit/)
//RGP
In his article in the May 2015 issue of APSR, Evgeny
Finkel makes a splash by arguing that
exposure to “selective repression” (such as surveillance, beatings, arrests, and torture)
helps dissidents to develop a robust skill set with which to maintain enduring
resistance later on. He supports this argument with data from an unlikely case—Nazi
repression against three Jewish ghettos during the Holocaust—and shows how operational
skills (the “resister’s toolkit”) often develop as an indirect result of past exposure
to state repression. These skills then help dissidents to remain active in resistance
even when the state is engaging in widespread, indiscriminate, and severe
repression. I’ll direct you to Finkel’s article for more detail on the argument, data, and findings. So, what skills does
Finkel identify as being crucial elements of a resilient, enduring violent resistance? He points to five tasks that can
“outfox” the government’s repressive agents: “establishing secure
communications channels; procuring weapons without being detected by government agents;
maintaining well-hidden meeting places and munitions caches; producing high-quality
forged identification documents; and being able to identify and neutralize
informers and government agents trying to infiltrate the organization” (341). These all make sense from
an operational perspective. Intuitively, maintaining organizational viability would be a necessary (but insufficient)
condition for sustained rebellion. The key insight, I think, is that Finkel views these skills as learned.
People can teach, experience, develop, perfect, and sustain them. If that’s true for violent
resistance, maybe it’s true for nonviolent resistance as well. This begs the question: what operational
skills are necessary for nonviolent movements to persist in the face of widespread state repression? One could imagine
that at least two-and-a-half of Finkel’s tasks would apply for a movement to remain viable. Namely: establishing secure
communications; maintaining secure meeting places (but without the munitions caches!); protecting against and reducing
the influence of infiltrators, informants, and provocateurs. In people-power campaigns, numbers matter a great deal. So as
a decent substitute for the ability to procure weapons, one could also imagine adding: “procuring” participants. Although
these basic tasks may be necessary for resistance movements to survive, additional skills may be required to win. For a
discussion of the skills required for nonviolent activists to succeed, see recent work by Peter Ackerman and Hardy
Merriman. Nevertheless, if we follow Finkel’s argument, the
people in society who would be the best
at organizing resistance would be those who had experienced routine (albeit nonlethal)
repression in the past, had gone underground and/or expanded their skills sets,
and could organize the society into sustained mobilization. Indeed, we see that in many nonviolent
struggles, people who emerge as leaders are often those who’ve experienced
detentions, beatings, arrests, or exile in the past (e.g. Mohandas Gandhi, Nelson
Mandela, Martin Luther King, Jr., Emmeline Pankhurst, Alice Paul, Aung San Suu Kyi, and many others).
This is true for diffused leadership structures as well, where the primary movers are often victims of past government
repression. I’m not sure how many of these figures developed the specific skill set Finkel identifies, but it is clear that
selective
repression against these figures served to further mobilize (rather than
demobilize) their movements. No one would suggest that implementing the five tasks is easy—either for
violent or nonviolent resisters. However, if Finkel is right that these skills can be taught and learned, he is clearly
assaulting the conventional wisdom that state violence renders
people essentially helpless and choiceless. Instead, selective state violence
designed to punish and limit dissident capacity may be the very thing that can
build up dissidents’ tactical and operational skills and allow them to return and
fight another day—even once government violence has become much more severe
and widespread. In a sense, governments may be in a catch-22, since repression today could mean
more skilled challengers tomorrow. Thus when it comes to using repression, states
reap what they sow.
These resistance skills are necessary for success – skilled leaders
sustain state resistance
Finkel 15 – Assistant Professor of Political Science and International Affairs at George
Washington University (Evgeny, “The Phoenix Effect of State Repression: Jewish Resistance
during the Holocaust,” American Political Science Review Vol. 109, No. 2, May 2015) //RGP
This analysis
of Jewish undergrounds in Minsk, Kraków, and Białystok
demonstrates a clear linkage between possession of the resister’s toolkit and
the underground’s ability to put up a sustained fight. It also highlights the
importance of previous cases of selective repression for the skills’ acquisition. In the
Online Appendix I present an econometric analysis of a large-N dataset of Jewish ghettos and test an observable
implication of my argument on a much larger universe of cases. The findings support my theory, but their validity might
be potentially hampered by substantial missing data—an unavoidable problem because numerous crucial and unique
demographic and electoral data sources were destroyed during World War II. This first attempt to introduce the skills
variable into the conflict-repression nexus scholarship also raises important questions that cannot be fully answered based
on my data. In this section I present several of these questions (and potential future research agendas) and address them
to the extent made possible by the available data. The
first question deals with the role of
leadership and leaders’ skills. Although international security scholars are paying growing attention to
leaders, their backgrounds, and conduct in the international arena (e.g., Chiozza and Goemans 2011; Saunders 2011), the
topic remains understudied in the contentious politics (Nepstad and Bob 2006, 1) and intrastate violence research (but see
Johnston 2012; Lawrence 2010). Thus, analysis
of resistance leaders’ skills would be an
important addition to the literature. Among the underground organizations analyzed in this article, all
groups that managed to put up a sustained fight had skilled leaders, whereas
the level of skills possessed by the rank and file varied. This suggests that skilled
leadership is a sufficient condition for sustained resistance, but it is impossible to say
whether it is a necessary one. To determine that we would need to examine underground groups that had skilled rank and
file but no skilled leadership. Unfortunately, there are no such groups in my data. It is possible that this cell is empty and
skilled leadership is indeed a necessary condition, but more research is required to provide a definitive answer. Another
important question is that of timing. What is the lifespan of resistance skills, and how far back in time can t–1 extend? J.
Bowyer Bell, a prominent historian of underground organizations, argues that once
acquired, operational
security skills tend to be sticky to the point of becoming “second nature” (1989, 18),
and many underground activists still keep them long after moving above the
ground. Thus, if the key is previous personal involvement in sustained underground, then t–1 is determined by the age
in which a skilled actor perceives him- or herself and is viewed by others as sufficiently young and fit to be involved in
such a high-risk enterprise. If we assume that people generally do not get involved in underground activism before their
mid-teens and after the age of 40 (especially true for the mid-twentieth-century context, when life expectancy was
substantially shorter), then the maximum distance between t and t–1 should not be more than 20 to 25 years. My data
offer a preliminary plausibility test of such an assertion. During the 1903–06 wave of anti-Jewish pogroms in Russia and
Poland, the main driving force behind Jewish self-defense was the socialist anti-Zionist Bund party (Lambroza 1981). In
the late nineteenth and early twentieth centuries, the Bund, a revolutionary Marxist movement, was targeted by the tsarist
security services. This selective
repression forced the party underground, and when the
indiscriminate, state-sanctioned violence of the pogroms broke out, the Bund
could capitalize on its members’ resistance skills to establish local-level self-defense units. During the Holocaust, however, the Bund played a much smaller role in the resistance. It was the
first Jewish party to engage in political and community work in occupied Warsaw and its members were active in the
underground in a number of ghettos, but almost nowhere did the Bund take the lead in organizing armed resistance
(Blatman 2003, 9, 92; for an opposing view see Zeitlin 2010). The reason seems to be the lack of personal underground
experience by the party’s younger members. In the USSR the Bund had ceased to exist in the 1920s; in interwar Poland it
was a legal political party that embraced parliamentary politics. During the Soviet occupation of eastern Poland, the Bund
disbanded and did not join the Zionists in the underground. Thus, although the Bund as an organization had a storied
underground history, few of its members under the age of 40–45 had underground experience. Those who did work in the
underground in the past were either too old to engage in such a demanding, high-risk enterprise or were “biographically
unavailable” (McAdam 1986) because they had to provide for their families. One hopes that future research will provide
more precise time boundaries of t–1.
Sustained resistance is key – only the resister’s toolkit guarantees it
Finkel 15 – Assistant Professor of Political Science and International Affairs at George
Washington University (Evgeny, “The Phoenix Effect of State Repression: Jewish Resistance
during the Holocaust,” American Political Science Review Vol. 109, No. 2, May 2015) //RGP
In a survey of the field, Davenport and Inman (2012, 620) include in the concept of state repression a wide range of
activities from violations of First Amendment-type rights to violations of “due process in the enforcement as well as
adjudication of law” and of “personal integrity or security (e.g., torture and mass killing).” This article analyzes resistance
to large-scale violations of “personal integrity or security,” the most violent form of state repression. Scholars of high-risk
activism focus almost exclusively on mobilization and the onset of resistance. Many influential studies emphasize the
importance of mental factors such as identities, honor, and pride (Einwohner 2003; Wood 2003); perceptions of threat
(Maher 2010); or commitment to the goal (McAdam 1986). Other works prioritize physical factors that make the outbreak
of violence more likely, such as rough terrain (i.e., Fearon and Laitin 2003). Still others focus on grievances (Cederman,
Wimmer, and Min 2010), the presence of mechanisms designed to overcome the collective action problem (Kalyvas and
Kocher 2007; Lichbach 1998), competition among nationalist groups (Lawrence 2010), or tipping points, after which
mobilization becomes widespread (Kuran 1991). These studies explain the first necessary, but not always sufficient, step
toward sustained resistance; mobilized resistance may or may not coalesce in the next stage. Consider
the case
of Primo Levi, a young Italian Jewish chemist and a future Nobel Prize winner
who, together with several comrades, fled to the mountains in 1943 to establish a “partisan
band affiliated with the Resistance. . . . [C]ontacts, arms, money and the experience to acquire them were all missing. We
lacked capable men” (Levi 1996, 13). The aspiring resisters, brave and devoted as they were, stood no chance against the
fascist security services: They were swiftly captured and Levi was shipped to Auschwitz. In Levi and his comrades’ case we
find a resolved collective action problem, the presence of substantial grievances, an abundance of rough terrain,
commitment to the struggle, honor, and pride. They
failed because they had no skills to translate
the initiation of resistance into a sustained fight. How do resisters obtain these
essential skills? I argue that an important pathway to acquiring the toolkit is through
past exposure to selective state repression. This argument builds on scholarship that
emphasizes the divergent short-term effects of selective and indiscriminate
repression, extending it to examine the longer term impact of exposure to
different types of repression. Knowing whether people are repressed because of what they do (selective
repression) or because of who they are, where they live, and to which identity group they belong (indiscriminate
repression) is crucial because different types of repression and violence produce different responses. Recently, Kocher,
Pepinsky, and Kalyvas (2011) introduced the concept of “collective” targeting, which is selective at the group level but
indiscriminate at the individual level. Thus, when it comes to individual and small-group responses and perceptions—the
key focus of this article—collective and indiscriminate targeting are treated as one. It is also important to note that, as
Table 1 demonstrates, the type and severity of repression are not identical; both can be (relatively) mild or severe, violent
or otherwise. Resistance to each type of repression also necessitates a different set of skills. The political violence (e.g.,
Downes 2007; Kalyvas and Kocher 2007; Lyall 2009) and state repression (e.g., Daxecker and Hess 2013; Dugan and
Chenoweth 2012; Hafez 2003; Mason and Krane 1989; Rasler 1996) literatures have paid substantial attention—though
with mixed findings—to the effects of different types of targeting and repression. However, it is crucial to note that for
these studies the main outcome of interest is the intensity of violence and contention in the short run, during the same
violence/repression campaign. In contrast, this article argues that, to understand the origins of skills and their impact on
resistance, we should account for both current and past events of repression. In this, my approach differs from the
mainstream contentious politics scholarship that links the shape and trajectory of resistance to a co-evolutionary, give-and-take dynamic between the state and the challengers. It shifts the focus to events that predate the outbreak of
contention. In the case of selective repression, when
people are targeted because of what they do
(e.g., political or social activism) they have a choice between ceasing their activities or
sticking to their ideals and risking punishment. If they decide to stick to their
ideals, the activists, to avoid punishment, need to either go underground fully or
adopt a semi-clandestine lifestyle. A study of underground movements in several Western countries
found that activists “experienced state pressure as an immediate, personal threat” and that “those unwilling to accept
arrest and imprisonment had no alternative” to the underground (Zwerman, Steinhoff, and della Porta 2000, 93–94).
Mason and Krane (1989, 180–81) observed a similar process in Central America. Because the targeted population is
relatively small and the risk of individual punishment is high, an underground becomes more likely and easier to organize.
To survive, activists are forced to learn operational security skills (Bell 1989). Some
might be killed or captured, but for the rest every additional day in the
underground provides an opportunity to better learn the toolkit. Because repression is
selective, the rest of the population does not have to worry about their safety. They have no incentives to go underground
and, by extension, acquire the resister’s toolkit. Indiscriminate repression works differently. Extensive evidence from
numerous case studies as diverse as the besieged Sarajevo (Maček 2009), China during the human-made famine of the
1950s (Jisheng 2012), and the Second Intifada suicide bombings in Israel (Somer et al. 2007) shows that the typical
responses to indiscriminate violence are acceptance of and adaptation to the situation as the “new normal,” fear, and
demobilizing feeling of powerlessness, rather than mobilization. Under indiscriminate violence, the threat is distributed
among a much larger group than under selective violence, and hence overcoming the collective action problem and
organizing an underground become much harder. The incentives structure changes only when the threat becomes
imminent, lethal, and immediate (Maher 2010; see also Goldstone and Tilly 2001). Before that point, organizing
resistance and going underground expose the first movers to an increasing, rather than decreasing, risk of punishment.
Yet, when the danger is already immediate and lethal, aspiring unskilled resisters will be unlikely to survive in such an
environment; they might simply not have enough time and opportunities to learn and adapt. At this stage, the key
problem is not free-riders, but loose lips, snitches, and sheer incompetence. However, if
some members of the
community were subject to past selective repression and already possessed the
needed skills, the transition from initial mobilization to sustained resistance
becomes more likely.
2nc surveillance = better for resistance
Blatantly increasing surveillance can spur more effective social
resistance
Martin 7 (Brian, Professor of Social Sciences at the University of Wollongong, Australia.
“Opposing surveillance”, http://www.bmartin.cc/pubs/07Michael.html//Tang)
Surveillance is commonly carried out in secret. When people don't realise it's
happening, they are far less likely to become concerned about it. The secrecy covering
surveillance is part of a wider pattern of government and corporate secrecy (Roberts 2006). Political
surveillance of individuals is normally done surreptitiously. Bugs are installed in residences;
telephones are tapped; remote cameras record movement; police in plain clothes observe at a discreet distance. There
is an obvious reason for this: targets, if they know about surveillance, are better
able to avoid or resist it. But secrecy is maintained beyond operational necessities: in most cases, the
existence of surveillance is kept secret long afterwards, often never to be revealed.
Exposures may require exceptional circumstances (Marx 1984), such as the collapse of East Germany's communist regime
or the "liberation" of FBI files at Media, Pennsylvania in 1971 by the Citizens' Commission to Investigate the FBI (Cowan et
al. 1974). When
surveillance is exposed, for example FBI surveillance of individuals
such as Martin Luther King, Jr. and John Lennon, it can cause outrage. The
revelation that the National Security Agency had been spying on US citizens since
2002 caused a massive adverse reaction. Employers sometimes do not want to
tell workers they are being monitored, when there is a possibility this may
stimulate individual or collective resistance. (On other occasions employers are open about
monitoring, when this serves to induce compliance.) Under the US Patriot Act, the FBI can obtain secret warrants to
obtain records from libraries, Internet service providers and other organisations. The organisations subject to this
intrusion cannot reveal it, under severe penalties. This draconian enforcement of secrecy serves to reduce personal and
popular concern about surveillance, for example when the Patriot Act is used against non-terrorist groups such as antiwar
protesters. In some cases, surveillance becomes routinised, so cover-up is less important. In many areas, camera
monitoring is carried out openly: it is possible to observe oneself, on a screen, walking into a shop.
On the other
hand, some forms of surveillance are hidden so effectively that they are
completely outside of most people's awareness, for example collection of web
data, meshing of database files, police checks on car licence numbers and
recording of bank transactions. The importance of low visibility in enabling surveillance to continue and
expand is apparent through a thought experiment: imagine that you received, at the end of every month, a list of instances
in which data had been collected about you, by whom and for what purpose. Imagine knowing whether you had been
placed on a list to be denied a loan or a job. Exposing
surveillance is crucial to challenging it.
Exposure requires collection of information, putting it into a coherent, persuasive
form, providing credible backing for the evidence, and communicating to a
receptive audience. Sometimes a single person can do all of these steps, collecting
information directly and publishing it on the web. Normally, though, a chain of participants is involved, for example an
insider who leaks documents, a researcher who prepares an analysis, a journalist who writes a story and an editor or
producer who publishes it. Campaigners help in exposure, as with Privacy International's Big Brother Awards for
organisations with bad records in threatening privacy.
Surveillance relies on multiple methods to repress resistance
Martin 7 (Brian, Professor of Social Sciences at the University of Wollongong, Australia.
“Opposing surveillance”, http://www.bmartin.cc/pubs/07Michael.html//Tang)
Over the years, many people have opposed surveillance, seeing it as an invasion
of privacy or a tool of social control. Dedicated campaigners and concerned citizens
have opposed bugging of phones, identity cards, security cameras, database linking and many other types of surveillance.
They have lobbied and campaigned against abuses and for legal or procedural
restrictions. Others have developed ways of getting around surveillance. In parallel
with resistance, there have been many excellent critiques of surveillance, exposing its harmful impacts and its role in
authoritarian control (e.g., Dandeker 1990; Gandy 1993; Garfinkel 2000; Holtzman 2006; Lyon 1994, 2003; Marx 1988;
Murray 1993; Rosen 2000). However, comparatively little is written about tactics and strategy against surveillance.
Indeed, social scientists have little to say about tactics and strategy in any field (Jasper 2006: xii-xiii). My aim here is to
present a framework for understanding tactics used in struggles over surveillance. Actions
that are seen to be
unfair or to violate social norms can generate outrage among observers (Moore 1978).
Nonviolence researcher Gene Sharp (1973: 657-703) found that violent attacks on peaceful protesters
– something that many people see as unjust – could be counterproductive for the
attackers, generating greater support for the protesters among the protesters'
supporters, third parties and even the attacking group. Because of this potential
for attacks to be counterproductive, attackers, by design or intuition, may take steps to
reduce possible outrage. By examining a wide range of issues – censorship, unfair dismissal, violent attacks on
peaceful protesters, torture and aggressive war – a predictable pattern in tactics can be discerned: perpetrators
regularly use five sorts of methods to minimise adverse reactions to their actions
(Martin 2007). 1. Cover-up: the action is hidden or disguised. 2. Devaluation: the target of the action is
denigrated. 3. Reinterpretation: plausible explanations are given for the action. 4.
Official channels: experts, formal investigations or courts are used to give an appearance of justice. 5.
Intimidation and bribery: targets and their allies are threatened or attacked, or given incentives to cooperate.
This is called the backfire model: when these methods are insufficient to dampen public
outrage, the action can backfire on the perpetrator. However, backfire is rare: in most
cases, the methods work sufficiently well to minimise outrage. Consider an
example different from surveillance: police use force in arresting someone. This
has the potential to cause public outrage if the force used is seen as unnecessary,
excessive or vindictive. Police in these circumstances regularly use one or more of
the five methods. If possible, they undertake the arrest out of the public eye. They refer to the person arrested as a
criminal or by derogatory terms. If challenged, they claim arrestees were resisting and that using force was necessary and
carried out according to protocol. They refer those with grievances to official complaints procedures, which almost always
rule in favour of the police. And they may threaten the arrestee with criminal charges should they make a complaint
(Ogletree et al. 1995). On
3 March 1991, Los Angeles police arrested a man named
Rodney King, in the course of which King was hit by two 50,000-volt tasers and
beaten with metal batons more than 50 times. This arrest would have gone
unnoticed except that George Holliday, who lived nearby, recorded the beating on
his new videocamera. When footage was shown on television, it caused a massive public and
political reaction against the Los Angeles police. Holliday's videotape cut through
the normal cover-up and allowed viewers to judge the events for themselves,
overriding the police's interpretation of the events and the media's normal police-sympathetic framing (Lawrence 2000). Nevertheless, in the ensuing saga the police and their supporters used
every one of the five methods of inhibiting outrage – though, unusually, in this case their efforts were unsuccessful in
preventing a huge backlash against the police (Martin 2005). Tactics for and against surveillance can be analysed using
the same framework. The
foundation for public outrage is a sense of unfairness. This is
certainly present at least some of the time: people may see surveillance as an
invasion of privacy (as with hidden video cameras), as a tool of repression (as in monitoring
dissenters) or
a tool of exploitation (as in monitoring of workers). The very word "surveillance" is a tool in
opposing it, because the word has such negative connotations. A sense of unfairness is not inherent in
the act of observing someone or collecting and analysing data about them. People's
sense of unfairness is the subject of a continual struggle, with privacy
campaigners trying to increase concern and purveyors of surveillance techniques
trying to reduce it. Methods to inhibit or amplify outrage are used within the prevailing set of attitudes and in
turn affect those attitudes. Given that some people see surveillance as inappropriate,
unfair, dangerous or damaging, there is a potential for resistance and hence it is
predictable that one or more of the five methods of inhibiting outrage will be
deployed. In the remainder of this paper, I look at each of the five methods of inhibiting outrage and ways to
challenge these methods. The five-method classification used here is a convenient framework for examining tactics for and
against surveillance. To
use this framework does not require actors to be consciously
engaging in a struggle, as many are simply reacting to the circumstances in which
they find themselves. For those who are concerned about surveillance, though, it is useful to think in terms of
tactics and strategies.
at: legal solutions solve
Politics and legal solutions can’t change the surveillance state
North, PhD, ’13 (Gary North, PhD UC Riverside “Surveillance state will collapse; data
overload increasingly blinds it” July 29, 2013 http://nooganomics.com/2013/07/surveillancestate-will-collapse-data-overload-increasingly-blinds-it/) //GY
On the next day, July 24, the House of Representatives voted down an amendment to
cut the NSA’s budget — the official one, not the real one, which is secret. It was
Nancy Pelosi who made the difference. She carried the NSA’s water. The failure of Congress
to make a token cut in the National Security Agency’s official budget on July 24
was a green light for the NSA to spy on all Americans, forever.¶ Congress knows
that the voters do not care enough to mobilize against the Patriot Act, which is the heart
of the surveillance state. They also know that most House members are immune from
the voters. Gerrymandering works. They also know that they, personally, are not immune from
the NSA’s monitoring of their telephone calls, emails, and other communications. They can count
the votes. They know who is on top. The surveillance state is on top.¶
The
surveillance state is now unstoppable politically. Legally, there is no
possibility that it will be rolled back. It is now the non-law of the land. Wyden
thinks the voters may roll it back. They won’t. It is unstoppable politically.