Facebook Data Storage Centers as the Archive's Underbelly

Television & New Media
2015, Vol. 16(1) 3–18
© The Author(s) 2013
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476413509415
tvnm.sagepub.com
Mél Hogan1
Abstract
As the quintessential digital archive, Facebook no longer requires an introduction;
its user-base is currently estimated at one billion profiles. On the front end, it is the
epitome of the postmodern living archive. Its underbelly, however, remains much less
explored and theorized. What kinds of servers are required to host such large amounts
of “free” information, offering up data so rapidly, across so many platforms? Taken
together, these pragmatic questions inform an important theoretical intervention:
these dislocated centers—existing in “enterprise zones” and arctic hideaways—
not only effectively blind us to the potential environmental costs of our everyday
obsession with self-archiving but also demand a serious revision of the preservation
ideals that underpin the archive. This article offers up a series of provocations about
data storage centers, as the archive’s underbelly, with the intent of reconnecting
Facebook to the bodies and machines that enable it and the ideals that inform it.
Keywords
culture, environment, Facebook, Internet, new media theory, technology
Introduction to the Underbelly
As depicted in the film The Social Network (2010), the idea of Facebook emerged from a group of math- and programming-savvy friends, led by Mark Zuckerberg, with a desire, if not desperation, to connect and to have those connections serve as visible markers of their social ranking. That concept proved appealing to the college students it first catered to, but it also quickly gained popularity outside the academic realm.
1University of Colorado, Boulder, USA
Corresponding Author:
Mél Hogan, University of Colorado, Boulder, 1032 Ridglea Way, Boulder, CO 80303, USA.
Email: info@melhogan.com
On February 4, 2004, Facebook shifted from a dorm-room pastime to a corporation: more people were hired to run the Facebook website, and space was leased to match the size of the enterprise and its ambitions. In less than one year, the site reached one million users, and by September 2012, that number of active members had multiplied a thousandfold. As demonstrated by its unrelenting growth and near global reach, Facebook has become the de facto living archive, collecting perfect trivia from and for almost everyone. As The Wall Street Journal put it in 2012, "Mark Zuckerberg has just six billion more people to go" (Fowler 2012), a number that dropped to five billion less than a year later (Facebook 2013). Facebook's success is generally measured by this user-base, by its ability to connect billions of people across the world, and by the social networking it so easily enables. Conversely, Facebook's growth is rarely discussed in terms of a material network or material consequences. More specifically, the very machines used to manage perpetual user requests are seldom mentioned in relation to the social network: the servers (Bennett 2010; Berland 2009; Blum 2012; Chun 2013; Cubitt et al. 2011; Gabrys 2011; Maxwell and Miller 2012; Parikka 2012, 2011). This issue of materiality is often sidestepped on blogs and in journalistic writing in favor of stories that cover the platform's potential for advertising, predictions about its future (and the future of the Internet), and check-ins on Mark Zuckerberg (Locke 2007). The material impacts of Facebook's perpetual feed, as a streaming archive, also remain understated in academic research, which instead addresses urgent policy, privacy, and surveillance concerns (Cohen 2008; Shepherd 2012), the ownership of user-generated content and capitalization through "big data" (boyd and Crawford 2011), and identity and user behavior analyses (Marshall 2012), to name but a few examples in a growing body of literature dedicated to the platform.
Facebook has not been carefully addressed as an environmental concern in media scholarship for a multitude of reasons. Maxwell and Miller (2012, 4) argue that, given our "love affair with technology," much of the critique within media and communication studies demands self-reflexivity: what facilitates our research is precisely the set of devices at our disposal, the very technologies scrutinized here through Facebook. In other words, there is a privilege inherent to research that not only investigates new media as a site of inquiry but also draws from it as its main (if sometimes only) source of information and its most important tool and method (Mayer 2012). Within the realm of environmental media, ignorance becomes part of the equation largely because of the rapid obsolescence of technology, which not only severs users' connection to "old" devices but also renders invisible the intergenerational effects and impacts on both people and planet (Gabrys 2011; Maxwell and Miller 2012). Like the subject at hand here, computer servers, much of the how and who and why remains at the level of the sublime and the magical.
Many questions remain unanswered about Facebook's social, political, material, and environmental impacts. These questions are crucial to a scholarly investigation of Facebook and to any analysis that attempts to address digital circulation through social media. The emphasis here is on the relationship between such vast data streams and their containers and, more precisely, on how the understated connection between personal data and storage demands a re-visioning of the so-called living
archive through data storage centers. The questions that launch my exploration are, "What kind of infrastructure and technologies are required to host such large
amounts of ‘free’ information, offering up data so rapidly and across so many platforms?” “How does Facebook’s advertising strategy inform how power is pulled from
the grid? How do its servers function?” “How are they powered?” “How many are
there?” “Where are they located?” “What are the database logics (and assumptions)
that inform the relationship between Facebook and the archive?” “What is wasted?”
The intention of my intervention is to bring to light the questions themselves; while some of them are answered in this piece, the scope and vastness of the issues that stem from them serve as a call to media scholars to take on and expand from what is raised here.
Taken together, these pragmatic questions inform an important theoretical intervention: these dislocated servers—existing in “enterprise zones” and arctic hideaways (i.e.,
the “Node Pole”)—not only effectively blind us to the potential environmental costs of
our everyday obsession with self-archiving but also demand a serious revision of the
preservation ideals that underpin the archive (O'Neill 2011; Srinivasan 2009). If, as
stated by Jennifer Gabrys (2011, 39), “Information, in all its fleeting immateriality,
bears a direct relationship to this landscape,” then what choices are we making about
how this so-called global online “archive” should run? What are our expectations? Who
benefits? What are the costs? And (how) are these impacts measured? My reading of the issues raised by these questions has been that the disconnect between the materialities of the Internet and the culture that develops from it reveals a cycle of waste that is not only about devices and technologies but also about identity and meaning-making.
Following this, the archival impetus that has long been instated to document and preserve stories and histories—even, and perhaps especially, through critiques of the
archive (Arondekar 2005; Cvetkovich 2003; Spivak 1999; Takahashi 2007)—is seriously displaced by data repositories of this ilk. This is why the archive becomes an
effective lens through which to explore data centers, by posing questions about what
it—big data aggregation—disrupts about the way we can understand ourselves over
and through time. The archival framework, insofar as it can allow for considerations of continuity and intergenerationality, also demands that the cycle of waste consider data in relation to the means of production and disposal that flank the culture of use. All of this is contained, framed, and housed within the politics of data
server storage. Where server farms are located, and how they affect those communities,
is presented here against justifications based on efficiency as well as our growing reliance on the Internet to tell us who we are and who we are connected to.
In Digital Rubbish: A Natural History of Electronics (2011), Gabrys nuances the
incarnation of digital information with a focus on the devices on and through which
information travels. Countering new media hype, she brings attention to the cultural
processes that make media fail, and in turn, the politics and ramifications of (often
planned) technological obsolescence. Similarly, in Greening the Media (2012),
Richard Maxwell and Toby Miller diagnose an increasingly wasteful culture, enhanced
and encouraged by devices that are quickly replaced and put out of use. In this book,
they also report on data servers that use the same amount of electricity as small cities,
and consequently, on the exponential growth of electricity consumption, and the invisibility of the damages incurred by the overconsumption of online data.
What follows is aligned with Gabrys's, Maxwell and Miller's (and others') challenge of weaving together the political, spatial, infrastructural, and social impetuses of the digital archive situated in an "always already material landscape." As such, this
piece offers up a series of provocations about data storage centers—hidden in plain
sight—as the archive’s underbelly, with the intent of reconnecting Facebook to the
bodies and machines that enable it, and the ideals that inform it. It also serves to
demystify the materialities of the Internet as informed by emergent archival theory and
as necessarily interlocked with environmental concerns—two fields of research that
for now remain neglectfully separate.
Dirty Data
In 2013, Facebook no longer requires a detailed introduction; an update of its user-base, which is currently estimated at one billion active monthly users (give or take a
few fake accounts), is normally sufficient to make a point about its social impacts. On
the front end, it is the epitome of the user-generated platform and of the living archive.
Its underbelly, however, remains much less explored, theorized, or accurately imagined: “The architecture of data storage is inhuman. Servers don’t need daylight, so the
spaces are lit by blinking power lights or eery fluorescence instead of windows”
(Chayka 2012). Beyond the failed aesthetics of this inhuman yet pulsating underbelly,
the reason intellectual and creative focus keeps to the front end is, for the most part,
obvious: it is an addictive social network where everything exciting, funny, dramatic,
and important is happening in real time, all the time. It is visible and interactive. The
social network lets you reconnect with long-lost friends, becomes your virtual business card, and forces you to track your own life along a timeline (Wehner 2012).
Facebook is so pervasive that it eats up anywhere from 9 percent (Protalinski 2012) to
25 percent (Van Camp 2010) of Canada’s and the U.S.’s Internet traffic. In 2012, it
accounted for one out of every seven minutes spent online (Protalinski 2011). We collectively “like” things two million times a minute (Leber 2012). We upload three
thousand photos every second. We ingest more than five hundred terabytes of data
every day (Chan 2012). Our usage seems infinite, if not humanly incalculable.
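These figures are easier to grasp when converted to daily totals. The back-of-envelope sketch below uses only the rates cited above; the arithmetic, and the Python used to express it, is mine rather than the cited sources'.

```python
# Back-of-envelope conversion of the usage rates cited above into daily and
# yearly totals. The input rates come from the cited sources; everything else
# is simple arithmetic, offered only to illustrate the scale.

LIKES_PER_MINUTE = 2_000_000      # Leber 2012: two million "likes" a minute
PHOTOS_PER_SECOND = 3_000         # three thousand photo uploads every second
TERABYTES_PER_DAY = 500           # Chan 2012: 500+ terabytes ingested per day

MINUTES_PER_DAY = 24 * 60
SECONDS_PER_DAY = MINUTES_PER_DAY * 60

likes_per_day = LIKES_PER_MINUTE * MINUTES_PER_DAY      # ~2.9 billion
photos_per_day = PHOTOS_PER_SECOND * SECONDS_PER_DAY    # ~259 million
petabytes_per_year = TERABYTES_PER_DAY * 365 / 1_000    # ~183 petabytes

print(f"likes per day:      {likes_per_day:,}")
print(f"photos per day:     {photos_per_day:,}")
print(f"petabytes per year: {petabytes_per_year:,.0f}")
```

Even these rough totals, billions of likes and hundreds of millions of photos a day, must be stored, replicated, and served somewhere, which is the material point pursued below.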
However, many critics have questioned Facebook's raison d'être. A quick Internet search reveals that everything from its shady terms of use to its dubious advertising models has been openly picked apart by technologists, scholars, and users alike. Ironically, the platform is so ubiquitous that it sometimes becomes the very vehicle, mobilizing agent, and tool of dissemination for these causes (Greenpeace: Unfriend Coal 2012). While much has been done to identify and problematize the ways in which
this kind of ongoing activity can be constituted as free labor (Terranova 2004) or
immaterial labor (Hardt and Negri 2000), even these important scholarly contributions
have failed to draw attention toward other kinds of invisibilities—those that facilitate,
and literally power, our nonstop networking on and through various interconnected
devices. To the point of my intervention, the central problem revolves around how
these invisibilities are conveniences that facilitate the blind and perpetual archival
machine and, in turn, inform or distort the politics of preservation on which the archive
is built.
According to a Greenpeace (2011) report—How Dirty Is Your Data?—Internet servers consume upward of one and a half percent of global electricity. Understood
in more concrete terms, this means that if the Internet (i.e., cloud computing) were its
own country, it would rank fifth in global electricity use (Greenpeace Blog 2011).
According to the same report, Facebook’s U.S.-based data centers are each consuming
the electricity of approximately thirty thousand U.S. homes. While at the quintessential hub, pulse, and record of the action, Facebook alone can hardly be made responsible for the collective consumption and data-sharing it enables. Users participate in
this mass exchange. They are consumers of both the data and the electricity it requires
to flow bits down the tubes. That being said, Internet flows are not all equal; Facebook
carves out the means by which its current is consumed. As one blog commentator
points out, "Facebook is a bandwidth hog; it forces frequent reloads of complex, data-heavy pages so as to increase its ad presentations (and, hence, its revenues)"
(Higginbotham 2012). Another blog comment reinforces this idea by suggesting that
“Facebook is an advertising company, with a social media platform built in”
(Shoebridge 2012). Considering the strategy that informs this potentially excessive
use of electricity—bandwidth, content reloading, and refreshing—what are the politics of this kind of power purging? Could Facebook reduce its pull from the grid by
devising an alternative, “green,” advertising strategy? Users have reported using
AdBlock (a browser extension) to diminish their bandwidth use from Facebook ads,
but this falls short of elucidating the complexities of the concerns it raises (PC Help
Forum 2012). This example is not intended to debate the ongoing ecological and ethical choices made by the company but rather to point to one instance
that suggests that our interactions with data, and the movement of data online, are
imbued with a rarely discussed push/pull politic. Arguably, this politic is about the
archive as it will increasingly be understood and defined.
What constitutes the archive has been a topic of much debate, especially with the
advent of user-generated sites, which often bypass traditional processes and yet supplant archival institutions in terms of both size and technologies (Stalder 2008).
Internet Archive’s Wayback Machine, which collects snapshots of the Web’s past;
YouTube’s incomparable video collection; and Google’s scanned book project are
each examples that have opened up the idea of large-scale online repositories as
archives. As a continuation of this (still disputed) (re)definition, attention should be
drawn to the invisibilities of the archive surfaced by Facebook and its underbelly.
Comparable assessments could be made for Apple, Google, YouTube, Amazon, and
so on, all of which share attributes of this growing archival conundrum around storage
and electricity that sidesteps access at a very material, concrete level.
Everything we do online, and on Facebook, triggers servers as a means to locate
and return data.
This is becoming ever more the case with the shift to apps and streamed content,
which favors constant connection to a decentralized database in the cloud, over local
(i.e., saved to your computer or hard drive) media collections. Data storage exists as
clusters of clusters: rows of stacked servers inside stadium-like data centers, and
increasingly, in zones reserved for these facilities to proliferate as “enterprise zones”
(Esteve 2012). The size and location of these data centers are directly correlated to our
digital demands: Web searching, profile updating, poking, liking, watching videos,
making purchases, file sharing, messaging, and so on. These perpetual demands—
doubling globally every eighteen months (Walton 2012)—require a lot of energy from
servers, which in turn generate a lot of heat. To avoid a meltdown—literally—energy
is, therefore, also required to cool down the computers (Teicher 2012). While the ecological impact of these transactions—heating and cooling—is certainly at the heart of
any critical analysis of Facebook such as this one, it is the justifications themselves
(for the energy spent) that offer the richest theoretical terrain to explore.
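The doubling rate cited in this paragraph is worth pausing over, since exponential growth quickly outruns intuition. The sketch below simply compounds an eighteen-month doubling period; it is an illustration of the cited rate, not a projection from any dataset.

```python
# Compound the eighteen-month doubling period cited above (Walton 2012) to see
# how quickly demand scales. Pure arithmetic on the stated rate; no other data.

DOUBLING_PERIOD_MONTHS = 18

def growth_factor(years: float) -> float:
    """Multiplicative growth after a given number of years."""
    return 2 ** (years * 12 / DOUBLING_PERIOD_MONTHS)

for years in (1.5, 3, 5, 10):
    print(f"after {years:>4} years: roughly {growth_factor(years):,.0f}x the demand")
# after  1.5 years: roughly 2x the demand
# after    3 years: roughly 4x the demand
# after    5 years: roughly 10x the demand
# after   10 years: roughly 102x the demand
```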
Facebook does not reveal how much energy it needs to maintain its estimated
180,900 servers, but electricity remains the company’s largest expense (Gruener
2012). Facebook’s data storage centers are among the tens of thousands of data centers
that have been constructed to support the explosion of digital information in the last
few years, lifting power from the largest power grid in the world, most of which is
derived from burning fossil fuels (Martin et al. 2013). These costs have pushed companies with large data sets, such as Facebook, to develop more "efficient" data centers, though this efficiency has only slight resonance with ecological thinking, as most data centers remain coal-powered (and therefore further increase fossil fuel use) despite the plausibility of carbon-emissions-free alternatives (Cubitt et al. 2011; Kaufman 2010). For companies that rely on storage centers of this ilk, efficiency is, instead, a matter of speed: by this logic, delivering data more efficiently reduces the energy used to achieve the same task. As such, the Facebook archive is adapting to increasing
demand not by measuring its ecological and physical impact but rather by investing
further in its ability to do more within the same facilities, by upgrading its technology
(as exemplified by its Open Compute initiative, sharing plans for storage center architecture and best practices).
Faces of the Archive
This upgraded archive is always “on” and always able to deliver content. But by the
same token, it exists in a state of constant potential. Facilities operate at their full
capacity at all times, regardless of the actual demand, which means that an incredible
amount of energy is reserved for idling.
The entire process—much of it redundant—is constantly backed up (often using
polluting generators), in case of a power outage, activity surge, or glitch in the system, to ensure immediate and seemingly uninterrupted service. Systems are monitored around the clock; engineers are on hand to analyze and resolve production
issues twenty-four hours a day, seven days a week; and staff remain on call to
respond to problems, simultaneously generating and dealing with a poverty of excess
(Rossiter 2012).
This may be the single most telling insight about the archive—the ideal of instantaneity imparted onto it by users who are simultaneously creating and subjected to such
an unsustainable modality. The cost of such instantaneity is that almost all the energy
that goes toward preserving that ideal is, literally, wasted. As documented in The New
York Times, more than 90 percent of servers are reserved for and used as standby only, while the remaining 10 percent are used for computation (Glanz 2012). These figures
continue to grow in tandem as demands multiply: but to what end?
As Cubitt et al. (2011) argue, in an inquiry into Google servers similar to mine with
Facebook servers, sustainability is very much connected to our understanding of how
the Internet works and how digital artifacts circulate. Both rely on material infrastructures. Information online is not, in fact, green, weightless, and immaterial. Like other natural resources previously underestimated and mismanaged—oceans and forests, for example—markets and digital information present new, and perhaps more ephemeral, confrontations with excess, pointing to a limit in terms of growth. Rationing information will become a necessary step given the current expansion rate of data, a model currently set to fail because it denies its own limitations. Data collection in storage centers like Facebook's is based on the idea that the growth of data can continually be matched by physical storage centers.
Data centers manage not only the constant streams of data but also “past” data,
including bytes upon bytes of user data that lies dormant, abandoned blogs and cached
e-mail, filling server upon server. In 2010, Facebook was storing more than 135 billion messages per month and needed to parse volatile temporal data from rarely accessed data. This means that copies of the data are optimized for the age of the content (Essers 2012) and
that not all data are considered equal in the archive’s fabric. As Cubitt et al. make obvious, archivists have known for decades that appraisal is a political and intentional
gesture of filtering important information within a specific context. It is explicitly
about not accumulating everything. The problem with the mentality of Facebook (and, more recently, the NSA's plea to "collect it all") is that it "spills out of libraries and media archives into public space" (Cubitt et al. 2011, 155; Greenwald 2013). For
Facebook—straddling the corporate archive and public space—this preservation concern is about perfecting the ability to quickly respond to live data demands by storing
older data in a more cost-effective manner, a solution said to be located in layers of
software rather than one drawing from theories of appraisal or frameworks to establish
value.
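The age-based logic described here, in which copies of data are optimized for the age of the content, can be illustrated with a minimal tiering sketch. The thresholds and tier names below are hypothetical and are not drawn from Facebook's actual systems; the point is only that the decision is made in software, by access patterns, rather than by any appraisal of value.

```python
# Minimal, hypothetical sketch of age-based storage tiering: recently accessed
# items stay on fast "hot" storage, older items are demoted to cheaper, slower
# tiers. The thresholds are invented for illustration.

from datetime import datetime, timedelta

HOT_WINDOW = timedelta(days=30)     # assumed: fast, widely replicated storage
WARM_WINDOW = timedelta(days=365)   # assumed: denser, slower storage

def storage_tier(last_accessed: datetime, now: datetime) -> str:
    """Choose a tier purely from how recently an item was accessed."""
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "hot"
    if age <= WARM_WINDOW:
        return "warm"
    return "cold"   # archival storage: cheapest to keep, slowest to retrieve

now = datetime(2013, 1, 1)
print(storage_tier(datetime(2012, 12, 20), now))  # -> hot
print(storage_tier(datetime(2012, 3, 1), now))    # -> warm
print(storage_tier(datetime(2009, 6, 15), now))   # -> cold
```

Nothing in such a policy asks whether an item is worth keeping; it only asks how cheaply it can be kept.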
One telling anecdote that challenged the way Facebook determined layers of data
and user access to the past is that of law student Max Schrems, of Vienna, Austria,
who under EU law was entitled to request his data set from Facebook (O’Brien 2012).
In December 2010, after using the social network for three years, he demanded from
Facebook a copy of all the information they had collected through his profile: he was
sent a 1,222-page PDF (O'Neill 2011). This PDF outlines "records of when Schrems
logged in and out of the social network, the times and content of sent and received
messages and an accounting of every person and thing he’s ever liked, posted, poked,
friended or recorded” (Donohue 2011; Europe vs. Facebook 2012). In this same
article, Schrems is said to have remarked at the time on his amazement at "how much it remembers" and "how much it knows"—deleted posts or posts that are set to "private" fall into the same data bank as public posts in the Facebook archive (Cheng 2009). In this way, Facebook forcibly collects not only media assets but also—and more importantly—tracks data on the minutiae of our habits, locations, and connections, which begins to shed some light on the potentially darker, more evasive, and contradictory value(s) of the social network. Facebook recently removed the option to
download “wall” posts (Delta 2013). They argue for a “shared knowledge” economy
in their “Is Connectivity a Human Right?” document, where they write, “If you know
something, that doesn’t stop me from knowing it too. In fact, the more things we all
know, the better ideas, products and services we can all offer and the better all of our
lives will be” (Facebook 2013). They are using an argument that normally pushes
progressive solutions to knowledge management (Lessig 2004, 2007; Murray 2004), a
"non-rivalry in consumption" position that suggests that, unlike with material objects, sharing is not limited by the number of copies available. However, in a context where privacy is completely overlooked, sharing means something utterly different. Yet, for Facebook, there is no difference between data; no notion of public and private.
Facebook has since made it much more difficult for users to access their data in the
manner Schrems was able to, according to Europe versus Facebook. Users can now
gain access only to their own profile archive, and thus a fraction of the information
collected, without a turnaround-time delay imposed by law. Facebook considers its
user data as its intellectual property and retains, among other things, face recognition
data as a trade secret. However, data collection is not confined to use within the platform: even when users are logged out of Facebook, its cookies continue to crawl and
gather data on users’ clicks (Cubrilovic 2011). By simply reading an article on the
Web, or listening to a song online, for example, that content can be shared on behalf
of the user through Facebook, without their explicit consent (Hacker News 2011; Hill
2011). On January 15, 2013, Facebook announced a tool called Graph Search, which
according to Zuckerberg, puts data back in the hands of its users, allowing them to dig
through years of data collection from friends: “It knows which parks your friends like
to take their children to, or which pubs they like to visit, and who among their network
is single and lives nearby” (Sengupta 2013). The problem with this archive is that
aggregation says more about us than we consciously know we are making available.
Tracking at all these levels demonstrates the extent to which the social network itself
generates a parallel archive, of movement, recording the interactions of the network
itself, as a simultaneous—but exponentially bigger—living archive. This parallel
archive may come to make correlations about us of which we are not yet aware. But, given the distance from our data and the kind of storage afforded by mobility and the cloud, users remain detached from the contradictions, which, I argue, are embedded in the process itself. Framed this way, the living component of Facebook as archive reemerges with a particular point of emphasis: permanent exchange between nodes rather than storage (Ernst 2003) or, as Chun (2013) proposes, technological memory as a "technological organ," collectively (re)constructed
(and recontextualized) in the present rather than collected and preserved from the past.
From Crook County to the Node Pole
Most users are unaware of the processes involved in being online, where a simple
Facebook status update can travel thousands of kilometers in Internet conduits through
numerous data centers, processing tens of thousands of individual pieces of data,
before "arriving"—in a matter of seconds—at its (various) destinations. With these
processes, the Internet has completely thwarted our notion of time and of space. The
fact that the essence of Facebook exists in undisclosed and highly protected storage
centers only heightens the idea that this distance between users and the data they generate (in terms of content, habits, and networks, etc.) is necessary to maintain the
archival illusions of continuous uninterrupted access.
Locating the specific sites of Facebook’s data centers is next to impossible for
researchers, as companies do not disclose this information and the sites themselves—
often in innocuous and faraway buildings—are heavily protected. Images of these recent data storage centers are also, as yet, not part of Google Maps and not yet "surveillable" via Google Earth. This guarded distance between users and their data is
painfully provocative.
To support the growing activity of its social network since 2004, Facebook has
built several data centers, including its first non-U.S. facility. This offshore storage
center is made to metaphorically accommodate the 70 percent of Facebook users who
live outside the United States. Facebook also leases server space in nine or so data
centers bicoastally (Miller 2011).
In 2010, Facebook built its first data storage center in Prineville, Crook County,
Oregon, at a cost of 210 million dollars (Bacheldor 2012; Van Allen 2012). The storage center in Prineville was built on vacant grounds, on a high plain above the small town, exposing its 147,000 square feet of beautiful architecture while remaining conveniently out of sight. The facility created fifty-five jobs at Facebook proper, while the construction and initial setup have been said to have reinvested seventy-five million dollars back into the local economy (Cascade Business News 2012; Laycock 2012). The power used at this plant to run the social network, twenty-eight megawatts, matches the power used by all of Crook County. In terms of electricity consumption, the requirements of virtual life are matched to those of the surrounding lived realities.
However, according to journalist Andrew Blum, the trade-off seems entirely justified
given the efficacy of these storage centers in the services they provide (2012).
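The household comparison invoked here, like the Greenpeace figure cited earlier, can be checked with rough arithmetic. The sketch below converts a continuous twenty-eight-megawatt draw into average U.S. household equivalents; the per-household consumption figure is an assumed round number for illustration, not taken from the article or its sources.

```python
# Rough household-equivalence check for a continuous 28 MW draw. The household
# figure (~10,800 kWh/year) is an assumed U.S. average used only to show the
# order of magnitude; the result lands in the same range as the tens of
# thousands of homes cited in the Greenpeace report discussed earlier.

PLANT_MEGAWATTS = 28
HOURS_PER_YEAR = 24 * 365                 # 8,760
HOUSEHOLD_KWH_PER_YEAR = 10_800           # assumption, not from the article

plant_kwh_per_year = PLANT_MEGAWATTS * 1_000 * HOURS_PER_YEAR
household_equivalents = plant_kwh_per_year / HOUSEHOLD_KWH_PER_YEAR

print(f"{plant_kwh_per_year:,.0f} kWh/year ≈ {household_equivalents:,.0f} homes")
# 245,280,000 kWh/year ≈ 22,711 homes
```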
Crook County is an industrial zone—and this becomes one of the favorable sustainability arguments, as these areas are themselves recycled from energy-intensive industries that once occupied the grid, "so there was excess capacity on the grid to be
tapped and no new power plants were required to serve their energy needs” (Ecotrope
2012). This, along with huge tax breaks, makes it highly affordable for expansive
“nonutility” companies like Facebook to sprawl out. The justification for the location
of the center in Crook County is, however, a supposed concern over seismic risks and
temperature: a cool climate makes the operation less costly as cooling servers remains
a huge problem. Despite this, Facebook also at this time designed servers that could
withstand more heat (nine or so degrees more) so as to require less cooling.
Even before putting this first Prineville center online, Facebook announced in
November 2010 the construction of its second storage center—this one twice as
large—in Forest City, Rutherford County, North Carolina. This center—which will
see forty-two new jobs created at Facebook—replaces a former factory, overtaking the now-unraveled textile industry: the old plant was demolished and cleared for the construction of the new 450-million-dollar storage facility, designed much like the previous one in Prineville.
Also, in keeping with the "industrial zone" notion in Oregon, Governor Bev Perdue hopes that Forest City, Rutherford County, will refurbish these power ruins to become "a global business destination" (Crowley 2010), where special tax breaks crafted specifically to appeal to "data-centric companies like Facebook" are meant to encourage others to follow suit (Bracken and Pittman 2010). Not all were in favor of
its development. Ellie Kinnaird, a Carrboro Democrat, voted against the data center
bill in the summer of 2010 because, as she puts it, "This is just an outpost for big servers . . .
Have we brought that stimulating, intellectual talent here? No” (Bracken and Pittman
2010). In this sense, Facebook storage centers seem to function more as gated islands
than integrated facilities, but the profile page dedicated to the center contradicts this
perception for the virtual world looking in: it boasts of its ecological awareness, community involvement, and local developments. In this example, if we apply the same
logic as the Schrems anecdote, of Facebook as an archive that tracks itself and generates its own timeline and set of correlations, the Forest City data center's history will be fully imbued with ecological awareness, and Facebook will have the data to "prove" it. The histories created by Facebook data are significant, especially if and as Facebook serves as the most visible, most widely accessible, and most detailed public record, or what Parikka (2013) might refer to as a form of pollution of and within a media ecology. Media ecology, in this example, serves to shift the largely reinforced dichotomy between nature and the consumer of nature. It further demonstrates the extent to which
our very understanding of the ecological crisis is mediated and mitigated through technologies, limiting if not posing a challenge to the tools at our disposal to counter platforms that have come to dominate the media landscape, such as Facebook.
The third and most recent storage center to be built by Facebook is to be in Lulea,
Sweden, a town of fifty thousand residents. Here, again, and perhaps more believably,
Lulea is an ideal location: its cold climate serves the facility's cooling needs, and the company hopes to run the center entirely off electricity derived from renewable sources. Its regional power grid is said to
be extraordinarily reliable—no disruption of service since 1979—leading Facebook to
cut out seventy percent of the generators it relies on for backup at the U.S. facilities,
which in turn means less diesel fuel storage on-site and fewer emissions. At all locations, the backup generators are generally tested monthly, so the reduction is important. The ultimate goal is, presumably, to follow Google’s and Yahoo!’s lead and use
the network itself to reroute data in case of a power outage at a particular facility
(Miller 2011). This strategy requires major network capacity and multiple data centers
and so remains a possibility only for very large-scale operations.
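The rerouting strategy described above can be sketched in a few lines: when one facility drops off the grid, requests are simply directed to the next-best site that remains up. The site names and latency figures below are hypothetical, and the logic is a conceptual illustration rather than any company's actual routing system.

```python
# Conceptual sketch of cross-data-center failover: route each request to the
# lowest-latency facility that is still available. Sites and latencies are
# hypothetical placeholders, not real measurements.

ASSUMED_LATENCY_MS = {"prineville": 20, "forest_city": 35, "lulea": 110}

def route_request(available_sites: set) -> str:
    """Pick the fastest data center among those currently up."""
    candidates = {s: ASSUMED_LATENCY_MS[s] for s in available_sites
                  if s in ASSUMED_LATENCY_MS}
    if not candidates:
        raise RuntimeError("no data center available to serve the request")
    return min(candidates, key=candidates.get)

print(route_request({"prineville", "forest_city", "lulea"}))  # -> prineville
print(route_request({"forest_city", "lulea"}))                # outage -> forest_city
```

The catch, as noted above, is capacity: rerouting only works if the surviving facilities and the network between them are already overprovisioned, which is precisely the kind of idle reserve the previous section described.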
With arctic temperatures, the area has a natural way to cool servers, and, according
to CNN, the town has “cheap and plentiful electricity” (Mann and Curry 2012). As
with the previous two sites, in addition to cheap electricity, cool temperatures, and tax
breaks, Lulea also has few inhabitants and fewer prospects still for economic growth,
beyond the enticing option of opening up the “Node Pole” to Facebook and other data
centric companies. In all cases, desperation seems to be a locative factor, mitigated
only by the fact that the clusters of storage centers are out of sight and out of mind.
This third Facebook facility—which is in itself three complexes—is estimated to be
fully operational by 2014. Each of the three complexes is equal in size to the one in Forest City, which was itself double the size of the previous one in Prineville. Like the data
growth itself, the storage centers are proliferating at exponential rates, in size and
speed.
Living Archive
Facebook’s data storage centers are recent innovations. An early exploration offers a
vantage point that allows for reflections on the matter that informs the history of the
Internet at a time when data growth is starting to become, among other things, an issue
of control and containment. As demonstrated in this short article, the consequences are
at once immaterial and material. This duality reminds us of the ways in which digital
culture functions: largely because of the metaphorical distance we afford it (from our
mobile/wireless devices) and due to the ways in which we justify our needs for connectivity despite the complexities with which we are passively confronted. In this way,
data storage centers are the hidden monuments serving as perfect metaphors for our
current priorities.
If we imagine Facebook as a prototypical global archive in the face of mass data
creation and circulation—as our billion-plus participation seems to indicate (Vance
2012)—we are faced with the “always on, always available” connections it enables
through us (and perhaps our own desire to always be on). By looking at the politics
of these data centers, we come to understand the material space of the Facebook
archive, the electricity that powers the machines, and a virtual ethersphere that produces bigger records than the lived realities it records, as a politic of preservation
that is, on the one hand, successfully inhabited and, on the other hand, dangerously
reconfigured and protected as such. This brings us back to the notion of the living
archive, beyond the metaphysical hope of becoming immortal by being someday
wholly uploaded into a computer (Lanier 2006). Lanier’s metaphor, in dialogue with
Geert Lovink and Wolfgang Ernst, serves to conjure up insightful interventions
into how the living archive distorts time. A present time, against which a past time
is compared, displaces and dissolves the emphatic in favor of the flow, the context
in favor of what connections allude to. However convincing the tie between the living archive and life itself seems to be, for media scholar Lovink (2003)—in conversation with Ernst—what is embodied is no more alive or dead in terms of the ability
to trigger memory. He argues that outside of institutions mandated to deal with
knowledge management and preservation, the archive can increasingly be understood as embodied and as built into social networks if not in people themselves. For
Lovink, we are the “living” entities of the archive, rather than stale and static documents. To this, Ernst adds “in this hegemonic ideology, knowledge only exists if it
is up-to-date and can operate strategically, not hidden somewhere in a database,”
further invoking, in light of Facebook, both the urgency to reconnect bodies to
machines and data, and to draw attention to the machines that risk making sense of
data, to then organize bodies accordingly. The Facebook archive can easily be
framed as a panopticon, an archive of surveillance that can make finely tuned predictions serving specific ends. Framing Facebook as an archive, then, as opposed to mere storage or container, is important in that it brings to the surface the political connections between data and the ideals about the past and future that underpin and
continuously reshape what we mean by life, death, bodies, and memories and their
preservation.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of
this article.
References
Arondekar, Anjali. 2005. “Without a Trace: Sexuality and the Colonial Archive.” Journal of the
History of Sexuality 14 (1 and 2): 10–27.
Bacheldor, B. 2012. “Facebook’s Data Center Tax Bill Shrinks,” IT World, January 30. http://
www.itworld.com/data-centerservers/245675/facebook-s-data-center-tax-bill-shrinks.
Bennett, Jane. 2010. Vibrant Matter: A Political Ecology of Things. Durham: Duke University
Press.
Berland, Jody. 2009. North of Empire: Essays on the Cultural Technologies of Space. Durham:
Duke University Press.
Blum, Andrew. 2012. “What is the Internet, Really?” TED, September http://www.ted.com/
talks/andrew_blum_what_is_the_internet_really.html.
boyd, danah, and Kate Crawford. 2011. “Six Provocations for Big Data.” Decade in Internet
Time: Symposium on the Dynamics of the Internet and Society, September 21. Last modified October 30, 2012. http://papers.ssrn.com/abstract=1926431.
Bracken, David, and Kirten Valle Pittman. 2010. “Facebook Data Center Heads to N.C.”
NewsObserver.com, November 12. http://www.newsobserver.com/2010/11/12/797452/
facebook-data-center-heads-to.html.
Cascade Business News. 2012. “Facebook Unveils Economic Impact Study,” January 24. http://
www.cascadebusnews.com/news-pages/e-headlines/1814-facebook-unveils-economicimpact-study.
Chan, Casey. 2012. “What Facebook Deals with Everyday: 2.7 Billion Likes, 300 Million Photos
Uploaded and 500 Terabytes of Data.” Gizmodo, August 22. http://gizmodo.com/5937143/
what-facebook-deals-with-everyday-27-billion-likes-300-million-photos-uploaded-and500-terabytes-of-data.
Chayka, Kyle. 2012. “The Aesthetics of Data Storage.” Hyperallergic, October 9. http://
hyperallergic.com/58330/the-aesthetics-of-data-storage/.
Cheng, Jacqui. 2009. “Are ‘Deleted’ Photos Really Gone from Facebook? Not Always.” Ars
Technica, July 3. http://arstechnica.com/business/2009/07/are-those-photos-really-deletedfrom-facebook-think-twice/.
Chun, Wendy Hui Kyong. 2013. "Media Archaeology PROGRAM: Media and Literature,
Graduate Student Event, March 1.” http://vimeo.com/64248011.
Cohen, Nicole. 2008. “The Valorization of Surveillance: Towards a Political Economy of
Facebook.” Democratic Communiqué 22 (1): 5–22.
Crowley, Tim. 2010. “Governor Perdue: Facebook ‘Likes’ North Carolina Announces Social
Networking Leader to Build Data Center in Rutherford County.” http://www.highbeam.
com/doc/1G1-241959422.html.
Cubitt, Sean, Robert Hassan, and Ingrid Volkmer. 2011. “Does Cloud Computing Have a Silver
Lining?” Media, Culture & Society 33 (1): 149–58.
Cubrilovic, Nik. 2011. “Logging out of Facebook is not enough.” New Web Order, September
25. http://www.nikcub.com/posts/logging-out-of-facebook-is-not-enough.
Cvetkovich, Ann. 2003. An Archive of Feelings: Trauma, Sexuality, and Lesbian Public
Cultures. Durham: Duke University Press.
Delta. 2013. “Facebook Removes Downloads of Your Posts.” Angrymath, June 5. http://www.
angrymath.com/2013/06/facebook-removes-downloads-of-your-posts.html.
Donohue, Brian. 2011. “Twenty Something Asks Facebook for His File and Gets It—All 1,200
Pages.” Threat Post, December 13. http://threatpost.com/en_us/blogs/twenty-somethingasks-facebook-his-file-and-gets-it-all-1200-pages-121311.
Ecotrope. 2012. “Shrinking the Carbon Footprint of Facebook.” OPB, May 3. http://ecotrope.
opb.org/2012/05/shrinking-the-carbon-footprint-of-facebook/.
Ernst, Wolfgang. 2003. “Interview with Wolfgang Ernst: Archival Rumblings.” Geert Lovink,
February 20. http://geertlovink.org/interviews/interview-with-wolfgang-ernst/.
Essers, Loek. 2012. “Facebook to Use ‘Cold Storage’ to Deal with Vast Amounts of Data.”
Computerworld, October 17. http://www.computerworld.com/s/article/9232489/Facebook_
to_use_cold_storage_to_deal_with_vast_amounts_of_data.
Esteve, Harry. 2012. "Oregon Legislature Ensures Facebook's Prineville Plant Doesn't Face
Higher Taxes.” Oregonlive, February 22. http://www.oregonlive.com/politics/index.
ssf/2012/02/oregon_legislature_ensures_fac.html.
Europe vs. Facebook. 2012. “Facebook’s Data Pool.” http://europe-v-facebook.org/EN/Data
_Pool/data_pool.html.
Facebook. 2013. “Is Connectivity a Human Right?” https://www.facebook.com/isconnectivityahumanright/isconnectivityahumanright.pdf.
Fowler, Geoffrey A. 2012. "Facebook: One Billion and Counting." The Wall Street Journal,
October 4. http://online.wsj.com/article/SB10000872396390443635404578036164027386
112.html.
Gabrys, Jennifer. 2011. Digital Rubbish: A Natural History of Electronics. Ann Arbor:
University of Michigan Press.
Glanz, James. 2012. “Power, Pollution and the Internet.” The New York Times, September 22.
http://www.nytimes.com/2012/09/23/technology/data-centers-waste-vast-amounts-ofenergy-belying-industry-image.html?pagewanted=all.
Greenpeace. 2011. “How Dirty Is Your Data?” April 20. http://www.greenpeace.org/international/
en/publications/reports/How-dirty-is-your-data.
Greenpeace Blog. 2011. “Facebook’s New Datacentre: A Renewable-powered Friend?”
October 27. http://www.greenpeace.org/international/en/news/Blogs/Cool-IT/facebooksnew-datacentre-a-renewable-powered-/blog/37552/.
Greenpeace: Unfriend Coal. 2012. https://www.facebook.com/unfriendcoal.
Greenwald, Glenn. 2013. "The Crux of the NSA Story in one Phrase: 'Collect it All.'" The
Guardian, July 15. http://www.theguardian.com/commentisfree/2013/jul/15/crux-nsacollect-it-all.
Gruener, Wolfgang. 2012. “Facebook Estimated to Be Running 180,900 Servers.” Toms
Hardware, August 17. http://www.tomshardware.com/news/facebook-servers-powerwattage-network,16961.html.
Hacker News. 2011. “Facebook Is Scaring Me (Scripting.com).” http://news.ycombinator.com/
item?id=3033385.
Hardt, Michael, and Antonio Negri. 2000. Empire. Cambridge, MA: Harvard University Press.
Higginbotham, Stacey. 2012. “When Facebook Goes Down, the Internet Barely Blinks.”
GigaOM, June 1. http://gigaom.com/2012/06/01/when-facebook-goes-down-the-internetbarely-blinks/.
Hill, Kashmir. 2011. “Facebook Keeps a History of Everyone Who Has Ever Poked you,
along with a Lot of Other Data.” Forbes, September 27. http://www.forbes.com/sites/
kashmirhill/2011/09/27/facebook-keeps-a-history-of-everyone-who-has-ever-poked-youalong-with-a-lot-of-other-data/.
Kaufman, Leslie. 2010. “You’re ‘So Coal’: Angling to Shame Facebook.” The New York Times,
October 4. http://green.blogs.nytimes.com/2010/09/17/youre-so-coal-trying-to-shame-facebook.
Lanier, Jaron. 2006. "The Hazards of the New Online Collectivism." Edge.org, May 30. http://
www.edge.org/documents/archive/edge183.html.
Laycock, Holly. 2012. “Crook County.” Portland State Vanguard, February 15. http://psuvanguard
.com/opinion/crook-county.
Leber, Jessica. 2012. “Facebook’s New Power Player.” MIT Technology Review, April 12.
http://www.technologyreview.com/news/427482/facebooks-new-power-player.
Lessig, Lawrence. 2004. Free Culture: How Big Media Uses Technology and the Law to Lock
Down Culture and Control Creativity. London: Penguin.
Lessig, Lawrence. 2007. “TED Talks: ‘How Creativity Is Being Strangled by the Law?’” http://
www.ted.com/talks/larry_lessig_says_the_law_is_strangling_creativity.html.
Locke, Laura. 2007. “The Future of Facebook.” TIME, July 17 http://www.time.com/time/
business/article/0,8599,1644040,00.html.
Lovink, Geert. 2003. “Archival Rumblings: Interview with German Media Archeologist
Wolfgang Ernst.” Nettime.org, February 25. http://www.nettime.org/Lists-Archives/
nettime-l-0302/msg00132.html.
Mann, Juliet, and Neil Curry. 2012. “Scandinavian Cold Could Create Data Storage Hotspot.”
CNN, February 24. http://edition.cnn.com/2012/02/23/business/data-storage/index.html.
Marshall, Tara C. 2012. “Facebook Surveillance of Former Romantic Partners: Associations
with PostBreakup Recovery and Personal Growth.” Cyberpsychology, Behavior, and
Social Networking 15 (10): 521–26. doi:10.1089/cyber.2012.0125
Martin, Chris, Mark Chediak, and Ken Wells. 2013. “Why the U.S. Power Grid’s Days
Are Numbered." Bloomberg Businessweek, August 22. http://www.businessweek.com/
articles/2013-08-22/homegrown-green-energy-is-making-power-utilities-irrelevant.
Maxwell, Richard, and Toby Miller. 2012. Greening the Media. New York: Oxford University Press.
Mayer, Vicki. 2012. “Through the Darkness: Musings on New Media.” Ada: A Journal of
Gender, New Media, and Technology (Issue 1). doi:10.7264/N3CC0XMD
Miller, Rich. 2011. "Facebook Cuts Back on Generators in Sweden." Data Center Knowledge,
October 31. http://www.datacenterknowledge.com/archives/2011/10/31/facebook-cutsback-on-generators-in-sweden.
Murray, Laura J. 2004. “Protecting Ourselves to Death: Canada, Copyright, and the Internet.”
First Monday 9 (10). doi:10.5210/fm.v9i10.1179
O’Brien, Kevin J. 2012. “Austrian Law Student Faces Down Facebook.” The New York
Times, February 5. http://www.nytimes.com/2012/02/06/technology/06iht-rawdata06.
html?pagewanted=all&_r=0.
O'Neill, Nick. 2011. "Surprise, Surprise: Greenpeace Hates Facebook's New Energy Initiative."
AllFacebook—The Unofficial Facebook Blog, April 7. http://allfacebook.com/surprisesurprise-greenpeace-hates-facebooks-new-energy-initiative_b38191.
Parikka, Jussi. 2011. Medianatures: The Materiality of Information Technology and Electronic
Waste. JISC. http://livingbooksaboutlife.org/pdfs/bookarchive/Medianatures.pdf.
Parikka, Jussi. 2012. “Dust Theory: Media Matters as Ecology.” Paper presented at the Annual
Conference for the Canadian Communication Association, Wilfrid Laurier University and
the University of Waterloo, Ontario, Canada, May 30–June 1.
Parikka, Jussi. 2013. “An Alternative Deep Time of the Media: A Geologically Tuned Media
Ecology.” http://vimeo.com/72676524.
PC Help Forum. 2012. “Internet and Email.” March 27. http://www.pchelpforum.com/xf/
threads/facebook-eating-bandwidth.131510/.
Protalinski, E. 2011. “Facebook Accounts for 1 in Every 7 Online Minutes.” ZDNET,
December 27. http://www.zdnet.com/blog/facebook/facebook-accounts-for-1-in-every7-online-minutes/6639.
Protalinski, Emil. 2012. “Facebook Got 9% of All US Internet Visits in April.” ZDNET,
May 16. http://www.zdnet.com/blog/facebook/facebook-got-9-of-all-us-internet-visitsin-april/13239.
Rossiter, Ned. 2012. “Dirt Research.” In Depletion Design: A Glossary of Network Ecologies,
edited by Carolin Wiedemann and Soenke Zehle, 47–52. http://xmlab.org/fileadmin/hbk/
download/pdf/maad/wiedemann_zehle_2012_depletiondesign.pdf.
Sengupta, Somini. 2013. “Facebook Unveils a New Search Tool.” The New York Times, January
15. http://bits.blogs.nytimes.com/2013/01/15/facebook-unveils-a-new-search-tool/.
Shepherd, Tamara. 2012. “Persona Rights for User-Generated Content: A Normative Framework
for Privacy and Intellectual Property Regulation.” tripleC—Cognition, Communication,
Co-operation 10 (1): 100–13.
Shoebridge, Gavin. 2012. “How Do I Delete My Facebook History? Easy, Here’s How.”
April 29. http://www.gavinshoebridge.com/news/how-do-i-delete-my-facebook-historyeasy-heres-how.
Spivak, Gayatri Chakravorty. 1999. A Critique of Postcolonial Reason: Toward a History of the
Vanishing Present. Cambridge, MA: Harvard University Press.
Srinivasan, Ramesh. 2009. “Considering How Digital Culture Enables a Multiplicity of
Knowledges.” https://vimeo.com/5520100.
Stalder, Felix. 2008. “Copyright Dungeons and Grey Zones.” The Mail Archive, April 15. http://
www.mail-archive.com/nettime-l@kein.org/msg00613.html.
Takahashi, Tess L. 2007. “An Archive for the Future: The Imaginary Archive: Current Practice.”
Camera Obscura: Feminism, Culture, and Media Studies 22 (3 66): 179–84.
Teicher, Jordan G. 2012. “The Brain of the Beast: Google Reveals the Computers behind
the Cloud.” All Tech Considered, October 17. http://www.npr.org/blogs/alltechconsidered/2012/10/17/163031136/the-brain-of-the-beast-google-reveals-the-computers-behindthe-cloud.
Terranova, Tiziana. 2004. Network Culture: Politics for the Information Age. London: Pluto.
Van Allen, F. 2012. “Facebook’s New Data Storage Center Is a Power Hog.” http://www.wopular.com/facebooks-new-data-storage-center-power-hog.
Van Camp, Jeffrey. 2010. “Facebook: 25 pct of U.S. Traffic and 100+ Million App Downloads.”
Digital Trends, November 22. http://www.digitaltrends.com/mobile/facebook-25-pct-of-us-traffic-and-100-million-app-downloads/#ixzz2B2OdFikA.
Vance, Ashlee. 2012. "Facebook: The Making of 1 Billion Users." Bloomberg Businessweek,
October 4. http://www.businessweek.com/articles/2012-10-04/facebook-the-making-of1-billion-users.
Walton, Jon. 2012. “Green Norwegian Data Center Cooled by Fjords.” Digital Construction,
March 22. http://www.constructiondigital.com/green_building/green-norwegian-data-centercooled-by-fjords.
Wehner, Mike. 2012. “Timeline Mandatory Rollout: You Have 7 Days to Scour Your Past.”
http://news.yahoo.com/blogs/technology-blog/facebook-timeline-mandatory-rollout7-days-scour-past-185456598.html.
Author Biography
Mél Hogan is a postdoctoral fellow in digital curation in the Department of Journalism and
Mass Communication at the University of Colorado, Boulder. Her research explores the failures of the (promise of the) archive, feminist media archaeologies, and the politics of storage and memory. She is also the art director of the online and print-on-demand journal of arts and politics nomorepotlucks.org; on the Advisory Board of the Fembot collective; on the Administrative
Board of Studio XX; a Research Design Consultant for archinodes.com; and a Faculty Fellow
at the Media Archaeology Lab.
Defining the Sensor Society
Television & New Media
2015, Vol. 16(1) 19–36
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414541552
tvnm.sagepub.com
Mark Andrejevic1 and Mark Burdon2
Abstract
The proliferation of embedded and distributed sensors marks the increasing passive-ication of interactivity. Devices such as smart phones, cameras, drones, and a growing
array of environmental sensors (both fixed and mobile) and interactive online
platforms have come to permeate daily life in technologically equipped societies.
Consequently, we are witnessing a shift from targeted, purposeful, and discrete forms
of information collection to always-on, ubiquitous, opportunistic, ever-expanding
forms of data capture. The increased use of sensors marks important changes to our
understandings of surveillance, information processing, and privacy. In this article,
we explore the transformations associated with the emerging sensing environment.
The notion of a sensor society provides a conceptual basis for understanding the
characteristics of emerging forms of monitoring and control.
Keywords
data mining, surveillance, privacy, power, sensors, smart phones
From Interactivity to Sensing
A top sales executive at the Ford Motor Company caused a stir at Las Vegas’s highly
publicized annual Consumer Electronics Show in 2014 when he announced that,
thanks to embedded devices in his company’s cars, “We know everyone who breaks
the law; we know when you’re doing it . . . We have GPS in your car, so we know what
you’re doing” (Edwards 2014, para. 3). Although he later qualified that claim with the
assurance that the data are only used with customer “approval or consent” (presumably via a lengthy and obscure “terms of use” agreement), he highlighted an important
aspect of a growing array of networked digital devices: they passively collect enormous amounts of data that have wide-ranging potential applications in realms from
1Pomona College, Claremont, CA, USA
2University of Queensland, Brisbane, Australia
Corresponding Author:
Mark Andrejevic, Department of Media Studies, Pomona College, 333 North College Way, Claremont,
CA, 91711, USA.
Email: mark.andrejevic@pomona.edu
marketing to law enforcement and beyond (Sparkes 2014, para. 2). Automobile insurance companies are already using “black boxes” that track driving habits in exchange
for discounted rates: “Drive ‘well’ and you’ll keep your discount. Drive poorly and
you could see it disappear” (Cooper 2012, para. 4). One marketing company has
installed a different type of “black box” in businesses throughout downtown Toronto
that tracks mobile phones via the unique identifiers they send to Wi-Fi networks.
The result is that, unbeknownst to the phones’ owners, their shopping patterns, dining
preferences, and clubbing habits are collected, stored, and shared with participating
businesses: “The company’s dense network of sensors can track any phone that has
Wi-Fi turned on, enabling the company to build profiles of consumers’ lifestyles”
(Dwoskin 2014, B1).
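The Toronto example can be made concrete with a small sketch of the underlying logic: a fixed sensor logs the identifiers that nearby phones broadcast and accumulates them into repeat-visit profiles. The identifiers, shop names, and hashing step below are invented for illustration; no real capture library or vendor system is referenced.

```python
# Illustrative sketch of passive Wi-Fi presence sensing: each sighting of a
# phone's broadcast identifier is logged against a location, building a profile
# of movements without any action by the phone's owner. All values are made up.

import hashlib
from collections import defaultdict

profiles = defaultdict(list)   # pseudonymized device id -> list of locations

def record_sighting(device_id: str, location: str) -> None:
    """Log a sighting under a pseudonym derived from the broadcast identifier."""
    pseudonym = hashlib.sha256(device_id.encode()).hexdigest()[:12]
    profiles[pseudonym].append(location)

# One phone, with Wi-Fi left on, passes three participating businesses in a day.
for place in ("coffee_shop", "food_court", "nightclub"):
    record_sighting("aa:bb:cc:dd:ee:ff", place)

for device, places in profiles.items():
    print(device, "->", places)   # a lifestyle profile assembled passively
```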
These are just two examples of the ways in which forms of pervasive, always-on,
passive information collection are coming to characterize the use of digital devices
and the business models with which they are associated. If, once upon a time, the
mobilization of the promise of interactivity was characterized by the enthusiastic portrayal of heightened forms of active participation on the part of users, the automated
collection of data “passive-izes” this interactivity. These days, we generate more than
we participate—and even our participation generates further and increasingly comprehensive “meta”-data about itself. Our cars, phones, laptops, Global Positioning System
(GPS) devices, and so on allow for the comprehensive capture of the data trails users
leave as they go about the course of their daily lives. In the business world, this device-driven data—combined with new techniques for putting it to use—has been enthusiastically greeted as a valuable economic resource: described as the “new oil,” it is
treated as a resource to be extracted, refined, and put to use (Deutscher 2013, para. 3).
The familiar moniker of “big data” is a direct result of proliferating forms of “interactive” data capture because it refers to the burgeoning reserves of data generated by a
growing array of sensors and made available for various forms of sorting, sharing, and
data mining.
In this regard, the rise of “big data,” the fascination with the figure of the “data
scientist,” the development of new forms of data analytics, and the “passive-ication”
of interactivity are interlinked via increasingly powerful and comprehensive sensing
devices and networks. We propose the concept of the “sensor society” as a useful way
of approaching these interconnections and exploring their societal significance. The
term is meant, in the first instance, to refer to a world in which the interactive devices
and applications that populate the digital information environment come to double as
sensors. In many instances, the sensing function eclipses the “interactive” function in
terms of the sheer amount of information generated. For example, the amount of data
that a smart phone generates about its user in a given day is likely to far surpass the
amount of data actively communicated by its user in the form of text messages,
e-mails, and phone calls (not least because each of these activities generates further
data about itself: where the text was sent, how long the call lasted, which websites
were visited, and on and on).
But the notion of a “sensor society” also refers to emerging practices of data collection and use that complicate and reconfigure received categories of privacy,
surveillance, and sense-making. Finally, the notion of the sensor society is meant to
direct attention toward the costly infrastructures that enable data collection, storage,
and processing as well as to the advantages that flow to the institutions that own, operate, and access them. There are structural asymmetries built into the very notion of a
sensor society insofar as the forms of actionable information it generates are shaped
and controlled by those who have access to the sensing and analytical infrastructure.
Some of the main attributes of an emerging sensor society include the following: the
increasing deployment of interactive, networked devices as sensors; the resulting
explosion in the volume of sensor-generated data; the consequent development and
application of data mining and machine learning techniques to handle the huge
amounts of data; and the ongoing development of collection, storage, and analytical
infrastructures devoted to putting to use the sensor-derived data.
Viewed through the lens of the “sensor” society, conceptions of interactivity and
notions of privacy and power appear in a somewhat different light than in recent celebrations and critiques of digital media. Database-generated forms of “knowledge”
that are “too big to know” (Weinberger 2011, 1) are not accessible in the way that other
forms of knowledge are. As we shall argue, data mining privileges those with access
to the data and the technology when it comes to generating actionable information that
may be neither fully explicable (in the sense of being illuminated by an underlying
explanation) nor reverse-engineerable. In the following sections, we consider in
greater detail the significance of these characteristics of the emerging sensor society
and their implications for new forms of data collection, monitoring, and surveillance.
The Rise of Sensors
Any networked interactive device can double as a sensor insofar as it collects and
relays data about how it is used, and these data can be used to infer information about
the user and/or the user’s environment. For a smart phone, for example, to provide
accurate and continuous location awareness, the device has to connect to a variety of
local Wi-Fi access points (or cellular network towers) while in transit. The transmission of these data not only enables the device’s functionality, but it also means that the
device can double as a sensor, and there is a growing range of apps that can be used
to collect data about users and their activities (Dwoskin 2014, paras. 1ff). This logic is
generalizable across the digital landscape: devices and applications developed for one
purpose generate information that can be repurposed indefinitely. For example, the
scanners that allow cashiers to enter prices more rapidly can also be used to track the
speed at which employees work; digital video recorders capture data about viewing
habits (including paused and fast-forwarded moments); e-readers capture data about
when and where a book is read, which passages or pages are skipped; Facebook
recently announced a mobile app that uses the microphones in smart phones to detect
nearby music or TV shows (Makarechi 2014); and so on.
Sensing technologies and apps for the smart phone industry alone have spawned a
rapidly expanding market as new sensing frontiers unfold. For example, the U.S.
Department of Homeland Security has funded a program to develop smart phone
sensors that can detect toxic chemicals in the air to provide an early warning system
for industrial accidents or terrorist attacks. Smart phone users would, in effect, become
distributed mobile sensors automatically relaying data back to the Department of
Homeland Security (DHS 2013) about air quality. By the same token, employers
increasingly rely on a range of sensors to monitor workers: keystroke monitoring software, smart cards that track employee movements, GPS devices that monitor drivers
and delivery personnel, and even applications that track employees’ facial expressions
(Waber 2013).
Researchers at MIT have even developed wearable monitoring devices called
“sociometers” that automatically track “the amount of face-to-face interaction, conversational time, physical proximity to other people, and physical activity levels”
among workers to “measure individual and collective patterns of behavior, predict
human behavior from unconscious social signals, identify social affinity among individuals working in the same team, and enhance social interactions” (MIT Media
Laboratory 2011). Even employee recruitment practices are being sensorised. A company called Evolv that mines large sets of recruitment and workplace data reported as
one of its key findings that “people who fill out online job applications using Web
browsers that did not come with the computer . . . but had to be deliberately installed
(like Firefox or Google’s Chrome) perform better and change jobs less often” (The
Economist 2013). The web browser used to fill out a job application becomes an
important element of the job application itself. As such examples indicate, the Internet
provides a model for the sensor society, insofar as its version of interactivity is one in
which, increasingly, movement through cyberspace generates data that can be collected, stored, and sorted. Digital sensors form an interactive overlay on the physical
spaces they populate, allowing them to become as trackable as the Internet. Thus,
devices like Google Glass, for example, transpose the affordances of cyberspace
(back) into the register of physical space: locations can be tagged and book-marked.
As such applications proliferate, our devices and our environments are likely to
become increasingly populated by sensors in what would once have seemed surprising
ways: car seats with heart-rate monitors, desks with thermal sensors, phones with air
quality monitors, tablets that track our moods, and so on. Once information about our
mood inferred through our facial expressions, body temperature, pulse, and so on can
be collected, a new array of sensors can be developed to respond to these data—and,
in turn, to collect, store, and make sense of the data generated by this response. When
interactive devices are treated as sensors, creative uses for existing data sets can be
developed and new sensing capabilities can be piggy-backed upon existing ones.
Consider, for example, the efforts of Microsoft researchers to develop apps that transform smart phones into “mood sensors” (LiKimWa 2012, 1). Rather than developing
a specific biometric sensor to detect mood (via, say, electroencephalogram [EEG]
readings, skin conductance, voice stress, etc.), the researchers simply tracked the ways
in which users’ self-reported moods correlated with their usage patterns, and then
developed a model that built on these findings to predict mood, reportedly with 94
percent accuracy (LiKimWa 2012, 23ff). As new forms of sensing and data collection
are devised, these are leveraged against already existing data troves that have
accumulated over years. The sensor-derived data and its collection can be repurposed
indefinitely.
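The inferential move described here—correlating self-reported mood with ordinary usage logs and then predicting mood from the logs alone—can be sketched as follows. This is not the MoodScope implementation; the features, labels, and tiny training set are hypothetical, offered only to make the technique concrete.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical daily usage features: [calls made, messages sent, late-night screen minutes],
# paired with the user's own self-reported mood for that day.
X_train = np.array([
    [12, 40,  5],
    [10, 35, 10],
    [ 2,  5, 90],
    [ 3,  8, 75],
])
y_train = ["good", "good", "low", "low"]  # self-reports are needed only during training

# Fit a simple classifier on the labeled days...
model = LogisticRegression().fit(X_train, y_train)

# ...then infer mood from usage logs alone, with no further self-report required.
print(model.predict([[4, 6, 80]]))  # -> ['low']
```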
In this regard, sensor-derived data collection is dissimilar to traditional practices of
surveillance even though sensor-related collection activities trigger similar surveillance and monitoring concerns. In their report on “The Surveillance Society” for the
U.K. Information Commissioner, David Murakami Wood and Kirstie Ball propose a
preliminary definition of surveillance as “purposeful, routine, systematic and focused
attention paid to personal details, for the sake of control, entitlement, management,
influence, or protection” (Wood and Ball 2006, 1). They further emphasize that “surveillance is also systematic; it is planned and carried out according to a schedule that
is rational, not merely random” (Wood and Ball 2006, 3). Similarly, in his influential
formulation of “dataveillance,” Roger Clarke refers to “the systematic monitoring of
people or groups, by means of personal data systems, in order to regulate or govern
their behaviour” (Clarke 2003, para. 5). Clarke subsequently distinguished between
targeted personal dataveillance and “mass dataveillance, which involves monitoring
large groups of people” (Clarke 2003, para. 7). Although the forms of sensor-based
monitoring associated with interactive media technologies share broadly in these logics of information collection, they also differ in important ways.
If, for example, as Wood and Ball (2006) argue, surveillance is focused and in reference to identifiable persons, this is only partially true of sensor-based forms of monitoring. The goal of sensor-related collection is the capture of a comprehensive portrait
of a particular population, environment, or ecosystem (broadly construed). More systematic forms of targeting start to take place against this background, and increasingly
come to rely on it. The population-level portrait allows particular targets to emerge—
and once they do, their activities can be situated in the context of an ever-expanding
network of behaviors and the patterns these generate. Thus, sensor-derived surveillance can be untargeted, non-systematic, and often opportunistic. Consider, for example, the fact that some U.S. military drones are equipped with a device called an “Air
Handler” that can capture all available wireless data traffic in the area through which
the drone flies. As one of the rare news accounts about this device put it, when a drone
goes out on a mission, “the NSA [National Security Agency] has put a device on it that
is not actually under the control of the CIA or the military; it is just sucking up data for
the NSA” (Goodman 2014). The drone then comes to represent a double-image of
surveillance: both the familiar “legacy” version of targeted, purposeful spying and the
emerging model of increasingly ubiquitous, opportunistic data capture. As one news
account puts it, “the NSA just wants all the data. They want to suck it up on an industrial scale. So they’re just piggybacking on these targeted operations in an effort to just
suck up data throughout the world” (Goodman 2014, para. 8). For drones, the signal-saturated sky is a sea of electromagnetically stored data that can be scooped up, processed, refined, and perhaps put to use.
Such examples highlight the additive, convergent, and intersectional character of
surveillance associated with sensor-based data acquisition. As new sensors come
online, the data they capture can be added to existing databases to generate new patterns of correlation. The goal is not necessarily to follow or track an individual, per se,
but to capture a specific dimension of activity or behavior across the interactive, monitored space—to open up new data-collection frontiers (mood, gait, typing patterns,
preferred browser, etc.) in an expanding “digital enclosure,” wherein a growing range
of spaces, places, and the information associated with them enter into the monitored
embrace of digital interactivity (Andrejevic 2007). This type of data capture gives new
meaning to the notion of focused monitoring: not exercised upon a particular individual per se but upon a specific dimension or register of activity. Even if individuals are
not the target of the pattern generation process, it becomes easier than ever before to
identify them, sort them, and target them. New sensors open up new dimensions of the
population, environment, or ecosystem. Once these dimensions are developed, they
can be compared with others to generate potentially useful patterns for purposes ranging from politics and policing to health care, employment,
education, marketing, and more. The goal is to broaden the range of monitored dimensions that give shape to the population–environment nexus, allowing it to emerge in
new ways as a site of detection, measurement, analysis, and intervention.
Defining the Sensor Society
Concepts such as “the information society” (Beniger 1986; Webster 2007, among others) and “the surveillance society” (Lyon 2001, among others) have relatively broad
currency in both the media studies literature and popular media discourses, so what
justification might there be for yet another sweeping moniker? The notion of a “sensor
society” not only fits within these broader categories but also isolates a
salient aspect of emerging social logics so as to focus attention upon them and their
broader implications for social, cultural, economic, and political life. The notion of a
“sensor society” (Schermer 2008), then, is meant to focus attention on developments
in the collection and use of information in the digital era that might help re-orient discussions about issues ranging from surveillance and power to privacy and social sorting. The frame of the “sensor society” addresses the shifts that take place when the
once relatively exceptional and discrete character of monitoring becomes the rule, and
when the monitoring infrastructure allows for passive, distributed, always-on data collection. Our hope is that directing attention to the logic of sensing-based monitoring
will open avenues for further exploration of the dimensions of a sensor society in
which the devices we use to work and to play, to access information and to communicate with one another, come to double as probes that capture the rhythms of the daily
lives of persons, things, environments, and their interactions.
In general terms, a sensor is a device that measures or detects an event (such as an
earth tremor or a status update) or state (such as the temperature) and translates this
measurement or detection into a signal: it “responds to stimuli” (the “sensitive element”) and “generates processable outputs” (the “transducer”) that are translated into
“readable signals” by a “data acquisition system” (Kalantar-Zadeh and Wlodarski
2013, 12–22). To view a device as a sensor within the context of the sensor society is
to approach it from a particular angle: to determine what type of information the sensor automatically collects (what it measures or detects), how this information is stored
and shared, and how it can be put to use.
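Read schematically, this definition describes a three-stage pipeline: a sensitive element responds to a stimulus, a transducer turns that response into a processable output, and a data acquisition system renders it a readable, storable signal. A minimal sketch under those assumptions (the class names, units, and conversion are hypothetical):

```python
import time

class SensitiveElement:
    """Responds to a stimulus (here, an ambient temperature in degrees Celsius)."""
    def respond(self, temperature_c):
        return temperature_c

class Transducer:
    """Generates a processable output from the raw response (e.g., a millivolt value)."""
    def transduce(self, raw):
        return raw * 10 + 500

class DataAcquisitionSystem:
    """Translates outputs into readable, storable signals that can then be shared and mined."""
    def __init__(self):
        self.log = []
    def record(self, millivolts):
        reading = {"timestamp": time.time(), "mV": millivolts}
        self.log.append(reading)
        return reading

daq = DataAcquisitionSystem()
print(daq.record(Transducer().transduce(SensitiveElement().respond(21.5))))
```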
Sensors can include any device that automatically captures and records data that
can then be transmitted, stored, and analyzed. A keystroke monitoring system on a
computer that can record the unique speed and pattern of an individual’s typing style
is a form of sensor, as is a web browser that can capture and record someone’s Internet
search habits (it detects and transduces). These devices may be much more than sensors, but they partake of the logic of sensing as a form of passive monitoring, and can
be treated as, among other things, components of an increasingly comprehensive,
albeit distributed and often disarticulated sensing apparatus. Some sensors may be
coordinated with others, but others rely on infrastructures that are owned and operated
by distinct entities that do not necessarily share information with one another. Sensors
do not watch and listen so much as they detect and record. They do not rely on direct
and conscious registration on the part of those being monitored. When one sends an
e-mail to someone, one is actively communicating to them, but when a device detects
the details of one’s online activity (including e-mails), sensor-based monitoring is taking place.
Thus, new realms of interactivity open up new dimensions of sensing and intervention, as do new technologies and practices. When automated license plate readers and
radio frequency identification (RFID) scanners were developed, it became possible to
trace mobility in new ways. When phones went mobile, they traced new frontiers in
geo-locational monitoring. These monitoring dimensions are further expanded by the
addition of Internet access and other interactive applications. A dedicated sensor is not
necessary to expand the sensing frontier: thanks to data mining techniques, e-mail,
phone activity, or browsing behavior can turn personal devices into mood detectors,
illness monitors, and fitness evaluators. We might divide these developments up into
new technological frontiers in sensing (the development of new forms of dedicated
sensors—location tracking devices, expression detectors, infrared or ultrasound detectors, toxic chemical detectors, etc.) and expanding frontiers in inferential sensing (the
ability to extrapolate information from the data provided by the existing technology
and dedicated sensors—such as inferring mood based on texting and web browsing
activity). In this sense, the data mining process helps to expand the available dimensions of sensing.
The Explosion of Sensor-Derived Data
The shift from targeted to comprehensive forms of data collection may be
enabled by new, inexpensive, and distributed forms of networked devices; but it is
driven by the logic of emerging forms of data analysis. When the goal is to generate as
much data as possible to discern otherwise inaccessible and undiscernible patterns, it
is impossible to determine in advance the full range of potentially useful types of
information. The goal then becomes to collect as much data as possible in as many
dimensions as are available. Unsurprisingly, then, the amount of data collected on a
daily basis is historically unprecedented but is, nonetheless, a small foretaste of things
to come. IBM (2013, paras. 1ff) claims, for example, that every day, about 2.5 quintillion bytes of data are generated—the data equivalent of a quarter million copies of the
print collection of the Library of Congress—and that 90 percent of the world’s stored
data have been created in the past two years. That is, if the entirety of recorded human
history were shrunk to the length of a day, the vast majority of its accessible stored
data would have been created in the equivalent of the last thirty seconds. Much of
these data are generated mechanically and automatically by a burgeoning array of
digital sensors that capture not just human activity but climate data, traffic flow,
machine activity, and so on. However, the upshot is that sensor-derived data accumulate faster than human hands can collect them and faster than human minds can comprehend them.
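The “last thirty seconds” image can be checked with back-of-the-envelope arithmetic, assuming (hypothetically) roughly five thousand years of recorded history compressed onto a twenty-four-hour day:

```python
# Hypothetical round numbers, used only to check the analogy in the text.
recorded_history_years = 5000          # assumed span of "recorded human history"
recent_years = 2                       # period in which ~90 percent of stored data was created
seconds_in_a_day = 24 * 60 * 60

equivalent_seconds = recent_years / recorded_history_years * seconds_in_a_day
print(round(equivalent_seconds, 1))    # 34.6 -- roughly "the last thirty seconds" of the day
```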
Capturing, storing, and making sense of huge amounts of data is a resource-intensive endeavor, despite the falling costs of digital storage and processing power.
The costs continue to escalate in part because what counts as “big data” continues to
dramatically increase, and in part because the goal of total information capture is built
into the data mining model. The Central Intelligence Agency’s (CIA) Chief Technology
Officer, Gus Hunt, has described this Google-inspired approach as a paradigm shift for
intelligence agencies insofar as they are moving “away from search as a paradigm to
pre-correlating data in advance to tell us what’s going on” (Hunt 2012). All data are
potentially useful in this framework:
The value of any piece of information is only known when you can connect it with
something else that arrives at a future point in time . . . Since you can’t connect dots you
don’t have, it drives us into a mode of, we fundamentally try to collect everything and
hang on to it forever. (Sledge 2013)
The result is that big data mining remains the preserve of large corporations and well-funded agencies. What counts as data about “everything” continues to grow as new
forms of sensing and sense-making are developed.
Thus, one of the characteristic challenges for emerging forms of sensor-derived
data collection is the sheer amount of information they generate. For example, when
the avalanche of images generated by U.S. surveillance drones threatened to outstrip
the ability of human observers to make sense of them, out-of-the-box thinkers at the
RAND Corporation turned to a seemingly unlikely source for inspiration and assistance: reality TV producers (Menthe et al. 2012). The latter had extensive experience
in sorting through hours of uneventful tape to isolate a few decisive moments. The
logic uniting drones and reality TV, according to military think tankers, is the need to
rapidly process the large amounts of information generated by twenty-four-hour,
multi-camera video surveillance. As one news account puts it,
. . . when you start thinking about some of these reality shows that have dozens of
cameras, continuously running, and then these producers trying to compartmentalize all
of that and cram it into a 30-minute episode, you start to get an idea of how much they
may have in common with the Air Force. (CNN 2012)
The RAND Corporation report is a meditation on the difficulties posed by the human
bottleneck in processing the tremendous amounts of data generated by sensors. The
problem is not a new one in the “intelligence” world: signals intelligence in the post–
World War II era has long posed the challenge of information glut: how best to make
sense of the increasingly large amounts of information that can be captured, stored,
and viewed or listened to by intelligence analysts. Recent developments in data mining and analytics indicate that the tendency will be away from human analysis and
toward automated forms of data processing.
The CIA’s Gus Hunt describes the shift toward data mining as one that replaces the
older “search and winnow” model, in which a small portion of useful data is kept and
the rest discarded. Thus, the CIA’s rationale for sweeping up as much data as possible
is representative of the logic permeating predictive analytics: the value of some forms
of information is speculative in the sense that it cannot be determined until further
“data points” arrive. The very possibility of utility warrants collection under conditions in which technological developments make it possible to store more and more
data due to the proliferation of sensors and the explosion of sensor-derived data. Given
the additive and speculative character of data mining (a data set might yield new and
useful patterns when paired with future information), the purpose and justification for
monitoring in the sensor society can come after the fact.
Meta-datafication
The automated capture and storage of data give rise to another important aspect of this
data explosion—what might be described as the process of meta-datafication—the
treatment of content as just another form of meta-data, or (by the same token), the
understanding that the only real content of interest, from a data analytical perspective,
is that which is machine-readable. Consider, for example, Google’s oft-repeated
rejoinder to those who accuse the search-engine giant of disregard for privacy because
of its aggressive information collection and tracking practices: “no humans read your
email or Google Account information” (Byers 2013). Machines do not attempt to
understand content in the way a human reader might. Instead, they scan e-mail and
online behavior for potentially useful patterns. The substance of this rejoinder to privacy concerns is that people should not worry because Google’s machines have transformed the meaningful content of their communications into meta-data: not actual
content but information about the content (what words appear in it, when, where, in
response to whom, etc.).
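A rough illustration of content treated as meta-data: rather than “reading” a message, the machine extracts pattern-generating features about it. The message and field names below are invented.

```python
from collections import Counter

def to_metadata(message):
    """Reduce a message to machine-readable features; no human 'reads' the prose itself."""
    tokens = message["body"].lower().split()
    return {
        "sender": message["from"],
        "recipient": message["to"],
        "sent_at": message["timestamp"],
        "word_count": len(tokens),
        "top_terms": Counter(tokens).most_common(3),  # patterns about the content, not its meaning
    }

email = {
    "from": "alice@example.com",
    "to": "bob@example.com",
    "timestamp": "2014-03-02T09:14:00",
    "body": "Running late, the clinic appointment ran long. Meet me at the clinic cafe instead.",
}
print(to_metadata(email))
```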
It is precisely the potential of the automated processing of sensor-derived data that
underwrites the productive promise of data analytics in the sensor society: that the
machines can keep up with the huge volumes of information captured by a distributed
array of sensing devices. Treating the content of e-mail as meta-data is one of the consequences of transforming networked communication devices into sensors that capture the behaviors and communications of users. Accordingly, one of the lessons of the
sensor society is that content can be treated as meta-data, insofar as emphasis on the
ideational content is displaced by the focus on patterns of machine-readable data.
Perhaps this shift is what MIT’s Big Data guru Sandy Pentland is gesturing toward
when he claims that
. . . the power of Big Data is that it is information about people’s behavior instead of
information about their beliefs . . . It’s not about the things you post on Facebook, and it’s
not about your searches on Google . . . Big data is increasingly about real behavior, and
by analyzing this sort of data, scientists can tell an enormous amount about you. (Edge
2013)
Pentland’s distinction does not hold up: what one posts on Facebook—along with
detailed information about when, where, and how—is a form of behavior, as are one’s
search patterns on Google. What Pentland is really getting at is what might be described
as the vantage point of “big data,” which privileges a perspective that focuses on information as a pattern-generating form of behavior and not as ideational content. Jeremy
Packer (2013, 298) sums up this perspective shift in his description of a model, “pioneered and widely implemented by Google” in which, “the logic of computation is
coming to dominate. In this model, the only thing that matters are directly measurable
results”—what Pentland describes as “behavior.” As Packer (2013, 298) puts it,
Google’s computations are not content-oriented in the manner that advertising agencies
or critical scholars are. Rather, the effect is the content. The only thing that matters are
effects: did someone initiate financial data flows, spend time, consume, click, or conform?
Further, the only measurable quantity is digital data. Google doesn’t and couldn’t measure
ideology.
This shift is what Pentland most likely means when he says that Facebook posts and
search requests are not of interest. That is, they are not of interest from an ideational
perspective. As behavior, of course, they help provide valuable data. The messages
themselves, when read by the machine, become, in a sense, contextual information
about themselves (and users) even when they are isolated from the ideational content
of the message to a particular receiver.
The notion that the collection of meta-data is somehow less powerful or intrusive
than that of the content with which they are associated has come under considerable
scrutiny (Narayanan and Shmatikov 2010; Ohm 2010). Former Sun Microsystems
engineer Susan Landau, for example, confided to the New Yorker magazine that the
public “doesn’t understand” that meta-data is “much more intrusive than content”
(Mayer 2013, para. 22). It is possible to unearth intimate details about individuals
without having a human actually read their communications. Knowing where people
go at what times of day, whom they communicate with, and so on, can reveal a lot
about them, including sensitive information about their health, their political inclinations, and their private lives.
It should come as no surprise that, from a privacy perspective, the process of meta-datafication erodes the concept of information privacy and the laws that flow from that
concept. Different definitions exist as to what constitutes personal information, but
typically information privacy law deals with information that can be used to identify
an individual. Personal information can therefore be specific data or combinations of
data that can identify individuals directly, such as full names, driver’s licenses, or social
security numbers, but it can also include data that indirectly identifies individuals. For
example, a residential address can be used to aggregate different sets of data that facilitate identification. The legal definitions of personal information recognize that the
nature of personal information generation is inherently contextual. Information can
become personal information in different contexts, at different times, and in different
social relationships.
The logic of the sensor society envisions the prospect that individuals will be
uniquely identifiable from the meta-data created by sensor devices and sensor networks. That is, seemingly anonymous information such as patterns of movement or
online search behavior, or even unique typing patterns, can give rise to the identification of unique individuals, especially in an environment where more and more sensors
collect a growing range of data. As data from different sensors is combined and mined,
it is possible to infer further information about such individuals—including details that
would, in other contexts, fall into protected categories—without needing to know their
names or their addresses. However, given the ease with which these data can eventually be traced back to named individuals by drawing upon combinations of databases,
all data about persons harvested from increasingly comprehensive sensor networks are
likely to become, for all practical purposes, personally identifiable.
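The linkage logic at issue can be pictured with a toy sketch, in which an “anonymous” sensor trace and a separately held, named record share quasi-identifiers (here, a home/work location pair); joining on them re-identifies the trace. All records are invented.

```python
# "Anonymous" sensor-derived traces: no names, only behavioral quasi-identifiers.
traces = [
    {"trace_id": "t-071", "home_cell": "A12", "work_cell": "C47", "late_night_venue": "club_x"},
    {"trace_id": "t-072", "home_cell": "B03", "work_cell": "C47", "late_night_venue": None},
]

# A separately held, named database (for example, billing or loyalty records).
named_records = [
    {"name": "J. Doe", "home_cell": "A12", "work_cell": "C47"},
]

# Joining on the shared quasi-identifiers re-identifies the "anonymous" trace.
for trace in traces:
    for person in named_records:
        if (trace["home_cell"], trace["work_cell"]) == (person["home_cell"], person["work_cell"]):
            print(person["name"], "is", trace["trace_id"], "->", trace["late_night_venue"])
```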
The Search for Un-anticipatable Patterns
Because the proliferation of sensors underwrites the recent explosion of digitally
stored data and pushes necessarily in the direction of automated data processing, the
forms of knowledge generated by automated forms of data mining become characteristic of a sensor society. These forms of knowledge rely upon emergent processes in
the sense that their goal is to generate un-anticipatable and un-intuitable correlations:
that is, patterns that cannot be predicted in advance. Thus, the imperative for more data
is not simply a result of the desire to gain as complete a record as possible of populations and environments but also of the data mining process itself: un-anticipated or
un-intuitive results can be generated by adding in new categories of data, even seemingly irrelevant data. For example, the fact that browser selection correlates with job
performance is not something that employers would be likely to anticipate—it is an
artifact of the data mining process, which included a consideration of variables not
traditionally associated with job interviews but made available through the mechanics
of the online job recruitment process. The data miners used the information because it
was available to them—part of the trove of information collected during the application process but not intentionally incorporated into that process. There is a rationale to
this kind of monitoring, but it is neither systematic nor targeted. Analysts do not start
out with a model of the world that they are setting out to prove or disprove, like a
detective trailing suspects, but with a trove of information. This trove is shaped by the
available sensing technology, much of which is, in turn, the result of affordances built
into devices, networks, and applications for a range of reasons that might initially have
little to do with the goals of those who seek to put the data to use.
The Ongoing Quest for Diachronic Omniscience
Predictive analytics and related forms of data mining extend into the future the quest
for what Lisa Parks (2005, 91) presciently calls “diachronic omniscience.” Parks uses
the term to describe the forms of comprehensive information capture associated with
satellite-based forms of monitoring: “the capacity of media to comprehensively record
global space through time” (as paraphrased by Russill 2013, 102). The hope on the
part of data miners is that comprehensive data about what happened can be used to
project into the future. However, as Parks has demonstrated in her discussion of satellite imaging, when the data from sensors accumulate, they can be used not only to
model the future but also to mine the past. Consider, for example, the use of digital
records to link suspects to crime scenes. Police have already used mobile phone data
to catch thieves by placing them at the scene of the crime and reconstructing their
movements in a subsequent car chase (Perez and Gorman 2013). The goal of “diachronic omniscience” invokes the possibility of a complete archive that could supplement the vagaries of reported actions and memories by externalizing them in the form
of machine-readable databases. Related to the repeated (but highly contestable) refrain that we need not worry about new forms of data collection as long as we
are not doing anything wrong is the claim that the database can provide us with alibis.
Alternatively, for those who are guilty, the archive can be used to link them to the
scene of a crime, to reconstruct their movements, to identify, and to eventually capture
them.
Any attempt to approach so-called “diachronic omniscience” necessarily entails the
formation of databases large enough to re-duplicate the world in informational form
and the development of analytic tools to make sense of these data. The issue of infrastructure is accordingly central to these examples and thus to any critical consideration
of the sensor society.
Jeremy Packer (2013, 297) captures something of this logic in his echo of the
Kittlerian call to attend to infrastructure:
Understanding media not merely as transmitters—the old “mass media” function—but
rather as data collectors, storage houses, and processing centers, reorients critical
attention toward the epistemological power of media . . . Media forge real power/
knowledge relationships that reassemble the world.
By contrast, the airy rhetoric of “cloud computing” and various notions of “immateriality” that have been associated with digital, post-industrial forms of production
and consumption represent what might be described as a turn away from infrastructure
in both popular and academic discussions of digital, networked media. Not that long
ago, brand-name futurists including Esther Dyson and Alvin Toffler proclaimed the
“central event of the 20th Century” to be the “overthrow of matter”—and along with
it allegedly anachronistic preoccupations with property, hardware, and infrastructure
(Dyson et al. 1996, para. 1). Even Hardt and Negri’s (2009, 294) conception of “immaterial labor” pushes in the direction of imagining a “self-valorizing” productivity
unfettered from the constraints of fixed capital: “Today, productivity, wealth, and the
creation of social surpluses take the form of cooperative interactivity through linguistic, communicational, and affective networks.” The tendency of such formulations is
to direct attention toward particular types of expressive and communicative activity
and away from the often privately owned and opaque infrastructures upon which they
rely.
The notion of the sensor society, by contrast, redirects attention toward the infrastructures that make data collection, capture, storage, and processing possible and consequently to the relations of ownership and control that shape who has access to data
and who sets the parameters and priorities for using that data. Consider, for example,
an account of the frustration evinced by one of the generals who helped oversee the
development of the Predator drone (one of the more highly publicized technological
icons of the sensor society): “he has grown so weary of fascination with the vehicle
itself that he’s adopted the slogan ‘It’s about the datalink, stupid’” (Bowden 2013,
para. 12). The drone, like the sensors distributed across the networked digital landscape, is “a conduit”: “Cut off from its back end, from its satellite links and its data
processors, its intelligence analysts and its controller, the drone is as useless as an
eyeball disconnected from the brain” (Bowden 2013, para. 12). In other words, the
sensor is inextricably related to the communication and analytical infrastructure upon
which it relies. Sensors can, of course, operate at close range, such as the devices that
detect whether a smart phone is in bright light or close to someone’s head. However, it
is when these data can be captured, stored, and shared—that is, when the sensors are
articulated to the infrastructures for data collection and analysis (and eventual
response)—that the salient characteristics of the sensor society emerge.
Making Sense of the Sensor Society
The proliferation of sensors pushes in the direction of automation: not simply in the
data collection process but in data analytics and response. Because the sensing process
is not discrete, but continuous, and because the target is not a particular individual or
moment but what might be described as a defined dimension (and any event that takes
place in that dimension), the data accumulates indefinitely. In broader terms, the additive goal behind the proliferation of sensors can be understood to be the digital replication of entire populations and environments enabled by a variety of distinct but
interconnected infrastructures. Individual targets are treated as pieces of a puzzle. All
of them must be included for the puzzle to be complete, but the picture is not of them
or about them, per se, but about the patterns their data form in conjunction with that of
others. In the sensor society, the target is the pattern and the pattern is an emergent one
(insofar as it cannot be detected until the analysis process is undertaken).
Conventional understandings of privacy as control over one’s self-disclosure and
self-presentation are complicated by this reconfiguration of targeting toward patterns
rather than people and especially by the emergent character of pattern generation. The
turn toward automated forms of predictive analytics means that it is, by definition,
impossible to reasonably anticipate the potential uses of the information one discloses.
The goal of data mining large quantities of information is, by definition, to generate
un-anticipatable and un-intuitable predictive patterns (see, for example, Chakrabarti
2009). That is, the data analytic process is systemically and structurally opaque. It follows that data collection and analytical infrastructures are equally opaque. The legal
theorist Tal Zarsky (2013, 1519) describes the decisions based on such data mining
processes as “non-interpretable” (and thus non-transparent) because of their inherent
complexity:
A non-interpretable process might follow from a data-mining analysis which is not
explainable in human language. Here, the software makes its selection decisions based
upon multiple variables (even thousands).
As such, processes of opacity that yield un-anticipated uses for data that result in uninterpretable decisions undermine some of the key foundations of information privacy
law, namely, informed consent and even ideas such as contextual integrity (Nissenbaum
2010). To the extent that the ongoing generation of un-anticipated uses becomes the
norm, the norms lose regulatory purchase: they do not rule out any particular use in
advance. The search for unpredictable and otherwise indiscernible correlations means
that so-called “function creep” is not ancillary to the data collection process but is built
into it: the function is the creep. Increasingly, all data need to be treated as personal
data in the sensor society because any given piece of data, aggregated with other available databases for the purpose of predictive pattern generation, could have the capacity
to identify an individual but more importantly could be used in a way that impacts on
their life chances.
Neither the concept of information privacy law nor anti-discrimination law is
designed to cope with the vastness of data collection and analysis envisioned by the
sensor society. All data simply cannot be personal information under the rubric of
information privacy law. All decisions of exclusion cannot be discriminatory under
anti-discrimination law. Quite simply, the legal systems created around these concepts
would fail to operate if that were the case. Regulation of the sensor society thus poses
a new set of legal challenges (Cohen 2012).
Underwriting these observations is the recognition that the sense-making processes
and the sensor technology must be considered in conjunction with one another. The
sensor society we are describing is inseparable from both its back-end infrastructure
and from the logics of sensor-driven data analysis and response. The ability to collect
large amounts of data becomes associated with new forms of sense-making (that rely
on capturing as much information as possible and on predicting and correlating rather
than explaining or understanding). Big data mining approaches push in the direction
of more comprehensive data collection and thus embrace the imperative of developing
more comprehensive sensing networks. Thus, the invocation of the notion of a sensor
society looks beyond the ephemeral construct of “big data” to invite a critical interrogation of the power structures that shape the development and use of sensing and
sense-making infrastructures.
To forward the notion of a “sensor society” is not to posit the wholesale transformation of all forms of information capture, processing, and use. We do not seek to contest
critical claims about surveillance in the digital era, so much as to add a further dimension—albeit one that we argue is unique and significant. Nor do we claim to have
exhaustively described the sensor society—which is an emerging phenomenon—but
we do hope that by defining a particular perspective, we have opened up avenues for
further exploration, both conceptual and empirical. Not all the attributes we describe
as characteristic of a sensor society are unique to it, and yet, we argue that their combination is unique and significant and that current popular, academic, and regulatory
discourses have not yet caught up with them or taken them fully into account. Our
hope is that in outlining the notion of a sensor society, we have highlighted some key
issues related to surveillance, monitoring, privacy, and control for the foreseeable
future. We anticipate that the study of what might be described as the cultural, social,
political, economic, and technological logics of the sensor society will become an
increasingly pressing concern as interactive devices proliferate and become equipped
with a growing array of increasingly powerful sensors. It is the task of those who seek
to understand these developments to ensure that their theoretical, conceptual, and critical formulations keep pace with the technology and its deployment.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The authors disclosed receipt of the following financial support for the research, authorship,
and/or publication of this article: One of the authors was supported by a research grant:
Australian Research Council Discovery Project Grant (DP1092606).
References
Andrejevic, Mark. 2007. iSpy: Power and Surveillance in the Interactive Era. Lawrence:
University of Kansas Press.
Beniger, James R. 1986. The Control Revolution: Technological and Economic Origins of the
Information Society. Cambridge, MA: Harvard University Press.
Bowden, Mark. 2013. “The Killing Machines: How to Think about Drones.” The Atlantic,
August 14. http://www.theatlantic.com/magazine/archive/2013/09/the-killing-machines-how-to-think-about-drones/309434/ (accessed May 15, 2014).
Byers, Alex. 2013. “Microsoft Hits Google Email Privacy.” Politico.com, February 7.
http://www.politico.com/story/2013/02/microsoft-renews-google-attack-on-email-privacy-87302.html (accessed May 15, 2014).
Chakrabarti, Soumen. 2009. Data Mining: Know it All. New York: Morgan Kaufmann.
Clarke, Roger. 2003. “Dataveillance—15 Years On.” Personal Website. http://www.rogerclarke.com/DV/DVNZ03.html (accessed May 15, 2014).
CNN. 2012. “The Situation Room.” September 10. Transcript retrieved online at: http://edition.
cnn.com/TRANSCRIPTS/1209/10/sitroom.02.html.
Cohen, Julie. 2012. Configuring the Networked Self: Law, Code and the Play of Everyday
Practice. New Haven: Yale University Press.
Cooper, Charlie. 2012. “Backseat Big Brother.” The Independent, August 9. http://www.independent.co.uk/life-style/motoring/features/backseat-big-brother-is-the-insurancecompanies-black-box-worth-it-8022694.html (accessed May 15, 2014).
Department of Homeland Security. 2013. “Cell-All: Super Smartphones Sniff Out Suspicious
Substances,” Official Website. http://www.dhs.gov/cell-all-super-smartphones-sniff-out-suspicious-substances (accessed May 15, 2014).
Deutscher, Maria. 2013. “IBM’s CEO Says Big Data Is Like Oil, Enterprises Need Help
Extracting the Value.” Silicon Angle, March 11. http://siliconangle.com/blog/2013/03/11/
ibms-ceo-says-big-data-is-like-oil-enterprises-need-help-extracting-the-value/ (accessed
May 15, 2014).
Dwoskin, Elizabeth. 2014. “What Secrets Your Phone Is Sharing about You—Businesses Use
Sensors to Track Customers, Build Shopper Profiles.” Wall Street Journal, January 14, B1.
Dyson, Esther, George Gilder, George Keyworth, and Alvin Toffler. 1996. “Cyberspace and
the American Dream: A Magna Carta for the Knowledge Age.” The Information Society
12 (3): 295–308.
Edge. 2013. “Reinventing Society in the Wake of Big Data: A Conversation with Sandy
Pentland.” August 30, 2012. http://www.edge.org/conversation/reinventing-society-in-the-wake-of-big-data (accessed May 15, 2014).
Edwards, Jim. 2014. “‘We Know Everyone Who Breaks the Law’ Thanks to Our GPS in Your
Car.” Business Insider (Australia), January 9. http://www.businessinsider.com.au/ford-exec-gps-2014-1 (accessed May 15, 2014).
Goodman, Amy. 2014. “Death by Metadata: Jeremy Scahill and Glenn Greenwald Reveal
NSA Role in Assassinations Overseas.” Democracy Now! (radio program), February 10.
Transcript retrieved online at: http://www.democracynow.org/2014/2/10/death_by_metadata_jeremy_scahill_glenn (accessed May 15, 2014).
Hardt, Michael, and Antonio Negri. 2009. Empire. Cambridge, MA: Harvard University Press.
Hunt, Gus. 2012. “Big Data: Operational Excellence Ahead in the Cloud,” Presentation to the
Amazon Web Services Government Summit 2011, Washington, DC, October 26. http://
www.youtube.com/watch?v=SkIhHnoPpjA (accessed May 15, 2014).
IBM. 2013. “The IBM Big Data Platform.” IBM Software Group (Web Page). http://public.
dhe.ibm.com/common/ssi/ecm/en/imb14135usen/IMB14135USEN.PDF (accessed May
15, 2014).
Kalantar-Zadeh, Kourosh, and Wojciech Wlodarski. 2013. Sensors: An Introductory Course.
New York: Springer.
LiKimWa, Robert. 2012. “MoodScope: Building a Mood Sensor from Smartphone Usage
Patterns” (Doctoral dissertation, Rice University, Houston, TX).
Lyon, David. 2001. Surveillance Society. Buckingham: Open University Press.
Makarechi, Kia. 2014. “Facebook Knows What Music You’re Listening To.” Vanity Fair, May
22.
http://www.vanityfair.com/online/daily/2014/05/facebook-listens-music-tv-shows-share (accessed May 28, 2014).
Mayer, Jane. 2013. “What’s the Matter with Metadata.” The New Yorker, June 6. http://www.
newyorker.com/online/blogs/newsdesk/2013/06/verizon-nsa-metadata-surveillance-problem.html (accessed September 2, 2013).
Menthe, Lance, Amado Cordova, Carl Rhodes, Rachel Costello, and Jeffrey Sullivan. 2012.
“The Future of Air Force Motion Imagery Exploitation: Lessons from the Commercial
World.” The RAND Corporation: Project Air Force. http://www.rand.org/content/dam/
rand/pubs/technical_reports/2012/RAND_TR1133.pdf (accessed March 15, 2014).
MIT Media Laboratory. 2011. “Sociometric Badges.” http://hd.media.mit.edu/badges/ (accessed
May 15, 2014).
Narayanan, Arvind, and Vitaly Shmatikov. 2010. “Myths and Fallacies of Personally Identifiable
Information.” Communications of the ACM 53 (6): 24-26.
Nissenbaum, Helen. 2010. Privacy in Context: Technology, Policy, and the Integrity of Social
Life. Stanford: Stanford Law Books.
Ohm, Paul. 2010. “Broken Promises of Privacy: Responding to the Surprising Failure of
Anonymization.” UCLA Law Review 57 (6): 1701–77.
Packer, Jeremy. 2013. “Epistemology Not Ideology OR Why We Need New Germans.”
Communication and Critical/Cultural Studies 10 (2–3): 295–300.
Parks, Lisa. 2005. Cultures in Orbit: Satellites and the Televisual. Durham: Duke University
Press.
Perez, Evan, and Siobhan Gorman. 2013. “Phones Leave a Telltale Trail.” The Wall Street
Journal, June 15. http://online.wsj.com/article/SB10001424127887324049504578545352
803220058.html (accessed September 2, 2013).
Russill, Chris. 2013. “Earth-Observing Media.” Canadian Journal of Communication 38 (3):
95–116.
Schermer, Bart. 2008. “Privacy and Visibility in the Sensor Society.” SlideShare. http://www.
slideshare.net/Considerati/privacy-and-visibility-in-the-sensor-society (accessed May 15, 2014).
Sledge, Matt. 2013. “CIA’s Gus Hunt on Big Data.” Huffington Post, March 21. http://www.
huffingtonpost.com/2013/03/20/cia-gus-hunt-big-data_n_2917842.html (accessed May 15,
2014).
Sparkes, Matthew. 2014. “Ford Boss Retracts Claim that ‘We Know Everyone Who
Breaks the Law.’” The Telegraph, January 10. http://www.telegraph.co.uk/technology/
news/10563828/Ford-boss-retracts-claim-that-we-know-everyone-who-breaks-the-law.
html (accessed June 17, 2014).
The Economist. 2013. “Robot Recruiters: How Software Helps Firms Hire Workers More
Efficiently.” April 6. http://www.economist.com/news/business/21575820-how-software-helps-firms-hire-workers-more-efficiently-robot-recruiters (accessed May 15, 2014).
Waber, Ben. 2013. People Analytics. London: FT Press.
Webster, Frank. 2007. Theories of the Information Society. London: Routledge.
Weinberger, David. 2011. Too Big to Know: Rethinking Knowledge Now that the Facts Aren’t
the Facts, Experts Are Everywhere, and the Smartest Person in the Room Is the Room. New
York: Basic Books.
Wood, David. M., and Kirstie Ball. 2006. “A Report on the Surveillance Society.” Surveillance
Studies Network, UK. http://ico.org.uk/about_us/research/~/media/documents/library/
Data_Protection/Practical_application/SURVEILLANCE_SOCIETY_SUMMARY_06.
ashx (accessed May 15, 2014).
Zarsky, Tal. 2013. “Transparent Predictions.” University of Illinois Law Review 2013 (4):
1503–70.
Author Biographies
Mark Andrejevic is an associate professor in the Department of Media Studies, Pomona
College. He is the author of Reality TV: The Work of Being Watched, iSpy: Surveillance and
Power in the Interactive Era, and Infoglut: How Too Much Information Is Changing the Way
We Think and Know, as well as articles and book chapters on surveillance, digital media, and
popular culture.
Mark Burdon is a lecturer in the TC Beirne School of Law, the University of Queensland. His
primary research interests are privacy law and the regulation of information sharing technologies. He has been a researcher on a diverse range of multi-disciplinary projects involving the
reporting of data breaches, e-government information frameworks, consumer protection in
e-commerce, and information protection standards for e-courts. His research is published in
leading law/technology journals in the United States, the EU, and Australia.
Article
Inside the Data Spectacle
Television & New Media
2015, Vol. 16(1) 37­–51
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414547774
tvnm.sagepub.com
Melissa Gregg1
Abstract
This paper focuses first on the scopophilic aspects of large scale data visualization—
the fantasy of command and control through seeing—and places these in relation
to key sites and conventions inside the tech industry. John Caldwell’s notion of
“industrial reflexivity” provides a framework to explain the charismatic power and
performative effects that attend representations of data as a visual spectacle. Drawing
on twelve months of personal experience working for a large technology company,
and observations from a number of relevant showcases, conferences, and events, I
take a “production studies” approach to understand the forms of common sense
produced in industry settings. I then offer two examples of data work understood as
a new kind of “below the line” labor.
Keywords
Big data, data work, data sweat, below the line, scale, industry research
Accounting for the spectacle of Big Data1 entails understanding the aesthetic pleasure
and visual allure of witnessing large data sets at scale. This paper identifies the scopophilic tendency underwriting key sites and conventions inside the tech industry, which
pivot on large scale data set visualization. I use John Caldwell’s (2008) notion of
“industrial reflexivity” to explain the charismatic power and performative effects that
attend representations of data as a visual spectacle, namely, the fantasy of command
and control through seeing (Halpern 2014). Drawing on twelve months of personal
experience working for a large technology company, and observations from a number
of relevant showcases, conferences, and events, this “production studies” approach
(Mayer et al. 2009) illustrates the forms of common sense produced in industry settings.2 Due to the proprietary nature of high tech, few scholars have access to the
points of ideological and intellectual transfer in which the promises of Big Data are
1Intel Corporation, USA
Corresponding Author:
Melissa Gregg, Intel Corporation, JF-2, 2111 NE 25th Ave, Hillsboro, OR 97214, USA.
Email: Melissa.gregg@intel.com
actively debated and constructed. I offer instructive examples of this process, negotiating the boundary of intellectual property restrictions and participant observation.3
The second objective of the paper is to theorize the labor of data. An important area
of attention in the emerging data economy is to assess exactly how users’ online activity involves them in profitable transactions, often without their knowledge (Scholz
2013). The analysis that follows adds nuance to this debate by identifying two instances
of “below the line” labor (Mayer 2011) in the Big Data era. The first of these is the
work of assembling the data spectacle, specifically the rhetorical work of the tech
demo in selling the visions on display. This genre and its default evangelism are normative features in the broader calendar of events for technology companies, large and
small. Combined, they are a leading instance of what Caldwell calls critical industrial
practice:
trade methods and conventions involving interpretive schemas (the “critical” dimension)
that are deployed within specific institutional contexts and relationships (the “industrial”
environment) when such activities are manifest during technical production tasks or
professional interactions (labor and “practice”). (Caldwell 2008, 1)
Professional interactions in the high tech industry involve generating commonsense assumptions—of technology’s benefits, of technological progress as inherently
good—a process that is pivotal to the broader experience of contemporary “data
work.”4 Pursuing an analogy between the Hollywood locations that are Caldwell’s
focus, and what is by now the rival center of mythologized cultural power in the United
States, Silicon Valley, I use an example from a recent developer forum in San Francisco
as an opportunity to unpack the ideological work of this type of industry event, one of
many routine settings in which Big Data rhetoric launches and lands.5 These elite
occasions for transferring insider knowledge operate as a flagpole running exercise for
messages that will be sold to consumers later in the product cycle. Yet their distance
from everyday users inevitably affects their ability to make appropriate judgments as
to market desire and need. As such, tech events often pivot on a combination of self-aggrandizement and hot air recycling referred to in the industry as “eating your own
dog food.”
The second aspect of “below the line” labor I attribute to Big Data is the work that
data does on our behalf, with or without informed consent. Recent popular distrust of
government agencies and technology companies colluding in the traffic of privileged
information reflects the growing realization that labor in the new economy is as much
a matter of non-human agency as it is the materiality of working bodies. After the
algorithm has been implemented, sensors, screens, and recording tools require little
human interference, even if the consequences of their scripts and commands only
become known after deployment. The political economy of data exhaust (A. Williams
2013)—or what I will call, using a more organic metaphor, data sweat—requires
deliberate strategies to overcome substantial power asymmetries (Brunton and
Nissenbaum 2011). Informed by recent media studies documenting the environmental
impact of machines that produce, harvest, and store Big Data (Gabrys 2011; Maxwell
and Miller 2012), the second part of this paper offers concepts that endorse responsible
participation in a data economy. My hope is that these terms may assist in holding the
purveyors of our data accountable for their actions.
In the move to a more “material” media studies (Gillespie et al. 2014), there has
been a hesitancy to draw together humanistic thinking with notions of the non-human,
a blockage that prevents a holistic account of labor in the digital conjuncture.6 Bringing
these two aspects of data work together, I aim to demonstrate the combined relevance
of humanities and social science methods in highlighting the ethical dimensions of
technology innovation, which include the social consequences of data work at the
level of the worker and his or her data. Given my position within the tech industry, my
sense of the overall landscape for Big Data is perhaps more positive than that of others; it is
certainly more optimistic than my reference to Debord’s Society of the Spectacle
would imply. The objective of this article is to suggest that if the forms of representation that commoditize our experience are today primarily visual (Halpern 2014), then
television and new media scholars have a unique and urgent role.
Visual Pleasure and the Rhetoric of Data
The delight and comfort that can occur in the process of conceptualizing Big Data
come, at least partially, from witnessing the achievement of large data sets represented at scale. The aesthetic pleasure summoned in these various constructions of
data—from word clouds to heat maps or the color codes of quantification platforms—
derives from their resolution of complex information through visual rhetoric (cf.
Massumi 2005). “Beautiful data” is the result of a century of modernist thought dedicated to adjusting the ways we see, visualize, and manage information. As Halpern
writes, in the Western tradition, vision “operates metaphorically as a term organizing
how we know about and represent the world” (Halpern 2014, 19). It is
a metaphor for knowledge, and for the command over a world beyond or outside of
subjective experience. To be seen by another, to see, to be objective, to survey, all these
definitions apply in etymology and philosophy to the Latin root—videre. (Halpern 2014)
Sharing the same root as the word “evidence,” vision is the word that aligns truth
and knowledge in different historical moments. In the case of Big Data visualization,
it is “about making the inhuman, that which is beyond or outside sensory recognition,
relatable to the human being . . . the formulation of an interaction between different
scales and agents—human, network, global, non-human” (Halpern 2014, 18). The
tech industry competes to provide this super-human insight via unique tools of data
assembly. This explains why in corporate settings, the possibility of data visualization
is regularly celebrated at the expense of considering the materiality of that which is
processed. A recent company showcase provides a case in point.
At a demo booth illustrating the work of a research center dedicated to Big Data,
onlookers were encouraged to watch, electrified, as synchronized TV screens displayed dynamic images and patterns panning out from a point of origin. The effect of
this performance was doubtless impressive, even if, to a lay viewer, the morphing blobs of color brought to mind little more than the lava lamps and fashions of 1970s disco. Engaging the spectator’s vision, simulating the experience of traversing (if not quite “tripping”) through data, the demo served the purpose of illustrating the vastness of the information being navigated. Yet when the presenter was asked, “What is the data set we are seeing?” it became clear that the data itself was fictive. There was no actual sample underwriting the demo; it was just a demo. The source of the data was irrelevant for a genre that only requires the indication of potential to achieve veracity. Like the trade rituals of film and video production, the tech demo exists within a wider ecology of “subjunctive” thinking that is the default mode of the developer forum: a means for “imagining—and showcasing—industrial possibilities on a liminal/corporate stage” (Caldwell 2008, 105).

Figure 1. Bruno Latour’s closing plenary, ACM SIG-CHI, Paris, 2013. ACM = Association for Computing Machinery; CHI = Computer-Human Interaction.
The affective properties of data visualization summoned by and through the demo
bring to mind previous examples of representing scale—the 1977 Ray and Charles
Eames film, Powers of Ten, being the most familiar.7 In this sense, it was only fitting
that a keynote speaker for the 2013 Association for Computing Machinery’s Computer
Human Interaction (ACM SIG-CHI) conference in Paris was local sociologist Bruno
Latour. The “expansive view” Latour chose to critique in his address [See Figure 1]
drew from his previous writing on monadology (Latour et al. 2012). This work is
informed by the ideas of Gabriel Tarde, and before him, Gottfried Leibniz, whose
mathematical modeling questioned neat distinctions between individual and collective
phenomena. At a conference dominated by discussions about Big Data, Latour challenged the congregation of industry and academic researchers, many of whom had
relied on “the fallacy of the zoom” in their empirical reliance on data visualization. In
Latour’s argument, a collective view provides no more accurate a representation than
that of an individual—indeed, it is precisely the move to an expansive view that threatens accuracy and specificity.
Latour’s career-long investigations highlight the role played by tools in assembling
vision. He questions the status and veracity of scale as a means of authorizing vision,
and points to the labor left out of the frame, lens, or medium through which we view
representations of reality. This approach acknowledges the selective nature of that
which is “given” in what we think we see. The tool of assembly (the camera, say, or
the algorithm) has agency in shaping sight toward certainties of apprehension. This
recognition allows a degree of caution in thinking about Big Data when to do so means
becoming unusually enamored with vision. It also suggests the relevance of aesthetics
in explaining the role that visual pleasure plays in securing solace, excitement, and
trust (Mulvey 1975).
The authority we attribute to scale is the result of historical accretion. According to
Anna McCarthy (2006), initial definitions of scale rested on the musical sense of capturing a sequence of notes in order. Think of the gradually ascending tone structure of
instruments we understand to be producing notes higher as opposed to lower in pitch.
Like climbing a ladder, the series or progression implied in the idea of scale is a neat
way to conceive relative order. We progress by degrees through positions that are
taken to be naturally equidistant. Of the seventeenth-century thinkers McCarthy identifies as asserting this basic metaphysical hierarchy, Francis Bacon brought mathematical systematicity to the idea of scale. Central to this is an understanding of scale
as proportion, which allows the significance of something to be observed “simply by
comparing it to other things, without reference to external standards of judgment”
(McCarthy 2006, 22). As a mode of reasoning, scale eventually stretched to influence
not only practices of mapping geographical territory but also nascent ideas of political
representation. Bearing resemblance to a thing—for example, a constituency—confirmed the ability for something or someone to stand in place of and for others. This
was also the period in which scale took on adjectival form. The consequences of this
have proven resilient in the longer history of epistemology. Scale provides a “mechanism of translation, or mapping, which connects material things and their representations in a precise, repeatable, and empirically known relationship which extends to the
process of representation in thought” (McCarthy 2006, 23). Reason could move from
the particular to the universal only as a result of these early articulations, which
bestowed an obvious logic to graduating concepts of measure.
In McCarthy’s reading, scale “helps stabilize a necessarily murky dichotomy: the
relationship between physical observation and mental speculation in inductive reasoning.” From spatial representations of hierarchy (epitomized in the ladder) to dominant
ideas of proportion (e.g., the map), a critical leap is necessary to join individual phenomena and broader conditions. Constructing the bridge between these two measures,
“scale regularizes the process of knowledge production by implying that there is a
proportional relation between the datum, the definite axiom, and the general axiom”
(McCarthy 2006, 24). The point here is that scale took on the function of reason through
an induction, which constitutes a rhetorical maneuver. To summon the term scale is to
mobilize “a thread of action and rhetoric actively connecting thought and thing, observation and speculation” (McCarthy 2006, 25). The execution of this link, and the continuum of empirical validity it suggests, is what we see playing out in tech demos today.
Presenting data at scale invokes an epistemological claim in the mere act of display. It
makes permanent what was once only plausible—a “cultural performance” of meaning
that, while lacking a sound empirical referent, bears the hallmarks of the “instrumental
and inductive perspective” favored in industry thinking (Caldwell 2008, 18).
Daniel Rosenberg (2013) offers another means by which to think historically about
data’s rhetorical work. In previous centuries, he suggests, “datum” was understood as
something given in an argument, something taken for granted. The obviousness of
data, its taken-for-granted-ness, emanated from the Latin origin of the word, which in
the singular means “gift,” or something that is “given.” In the domain of philosophy,
religion, and mathematics, data was used throughout the seventeenth century to designate that category of facts and principles that were beyond debate. It referred to things
that were assumed, essential, and hence already known before a problem was introduced for discussion. Data contained the parameters for thinking, the foundation upon
which later deductions would take place. Data is not, therefore, the same thing as fact.
Data is something presumed prior to discussion, a framework creating the possibility
for discussion. It therefore already contains judgments and decisions about what
counts as a prior-ity (both priority and a priori share the same Latin root; priorities are
taken from that which comes before). A data “set,” then, “is already interpreted by the
fact that it is a set,” according to Travis D. Williams: “some elements are privileged by
inclusion, while others are denied relevance through exclusion” (2013, 41). Like
McCarthy’s etymology of scale, these details draw attention to the cultural specificity
of reasoning. Even within the context of the English language, from previous usage,
we see that
facts are ontological, evidence is epistemological, data is rhetorical. A datum may also be
a fact, just as a fact may be evidence. But, from its first vernacular formulation, the
existence of a datum has been independent of any consideration of corresponding
ontological truth. (Rosenberg 2013, 18)
Rhetoric is a strategy of persuasion in the classical tradition. It is the art of convincing
others of the veracity and truth of something in spite of selective emphasis and exposure.
So while we might continue to think of data as that which is given, as that which is
regarded as bearing truth, we can see that the term’s shifting emphasis throughout history removes considerations of partiality. Only recently did it become typical “to think
of data as the result of an investigation rather than its premise” (T. D. Williams 2013, 33).
In the scripts tech workers perform during a demo, data’s power lies in the assumption that it is synonymous with fact. In the future-oriented mode of the genre, historicity is removed, and the benefits of the knowledge being assembled and transferred are
common sense. Taking a production studies approach, the further rhetorical effect at
play in this process is the entrepreneurial imperative of the evangelist. If Caldwell
warns of the dangers of industry-supplied PR in the Hollywood scene, and develops
scrupulous methods to contextualize partisan spin, the digital optimism and venture-capital-directed pitching that constitutes the tech demo requires similar analytical precision. It is not just the urgency and brevity of the encounter that illustrates the central
role of rhetoric in this default industry ritual. In the developer forum, the selective
showcasing of products and prototypes creates its own revelation, a preferred take on
the best that a company currently has to offer. In these settings, all encounters have the
character of a pitch (Gill 2011), right down to the questions of journalists and industry
analysts whose career status rides in tandem with the quality of insights and scoops
provided by a company’s star media performers. The hierarchy of access constituting
these events means it is never simply a matter of reporting objectively from the showcase on offer but of securing invitations to additional features and segments of uninterrupted time with talent. Persuasion operates on a multitude of levels: in the data being
presented, in the scripted lines of the worker out front of the demo, and in gaining
access to what is a heavily orchestrated display of the present and future of computing.
It continues into the press briefings, Twitter feeds, and column inches that construct
the public’s apparently insatiable appetite for new media devices, technologies, and
apps. In addition to the visual pleasure and power of data on display, then, the work
involved in assembling and authorizing the spectacle taking place within the convention center, tech campus, or downtown hotel is performed by a host of subsidiary
workers acting after the fact, to one side, behind-the-scenes, and after hours.
Data Agents
If demo booths are a crucial site for the assembly and rhetorical illustration of Big
Data’s commercial potential, the work that data does on our behalf—through data mining practices and other forms of network analysis—is an already established area of
concern for media studies (e.g., Andrejevic 2013; Arvidsson 2011). From an industry
perspective, the challenge posed by the data economy is less about limiting the scope of algorithmic surveillance than it is a race to define a profitable vocabulary for
transactions that have the potential to bring new opportunities for connection,
exchange, and wonder.8 If the prospect of data forming social relationships on our
behalf brings untold risks, a business point of view sees infinite possibilities. Music recommendation services (Seaver 2012) and online dating sites
(Slater 2013) are just two of these convivial applications, in addition to the so-called
sharing economy. With data as our agent, matching information with or without our
direct involvement, algorithms create new matches, suggestions, and relationships that
we are unable to achieve on our own. Data agents allow us to contemplate and revel in
the possibilities afforded by strangers (Bezaitis 2013), whose profiles and tastes might
anticipate or assuage our time-pressed needs. The very secrecy of online algorithmic
sorting—the extent to which hook-up sites and platforms flourish through the partial
revelation of identities and locations, for example—can foster collective social
practices that mainstream cultures may not wish to draw to light, presenting a boon for
sexual and other minorities (Race, forthcoming).
My use of the term data agent thus refers to occasions in which the sorting, categorizing, and matching capabilities of data algorithms act as a highly competent appendage, a publicist, or even, to adopt some detective imagery, our shadow. In the world of
Caldwell’s Hollywood, of course, agents have their own role. Agents act behind the
scenes—their work happens to the side and in the background of stages upon which
more visibly rewarding and profitable performances take place. Yet the agent’s work
is essential in filtering a surfeit of information to a manageable and actionable set of
options, matching available opportunities with potential investments. In the future
already being built, the data we produce will be used to do something similar, that is,
to work through algorithms to make decisions in our best interests, to sift out attractive
or unsuitable options, and to favor encounters that accord with previously identified
preferences. This is one way that data will entail new kinds of agency (if not an actually existing, incorporated agency, such as the talent scout . . . although there may be
merit in experimenting with this analogy too).
Decades ago, in The Presentation of Self in Everyday Life, Erving Goffman ([1959]
1973) relied on a similarly theatrical framework in his theory of region behavior. He
divided social performances into two realms: the front region, which was deemed to
be action on show to a public, and the back region, the site of relaxation and regeneration. Goffman suggested both regions host carefully cultivated performances that
respond to cues elicited and interpreted in their respective settings. In the data society,
a great deal of social work takes place off-stage, by non-human agents, as a result of
processing choices engineered by computers. These programming decisions are made
before any audience or user encounters the stage upon which communication later
takes place. In orchestrating the setting for an encounter, algorithms and platforms are
default editors for social messages. In assembling and choreographing the stage for
digitally mediated performances, they also incorporate the work of key grip and set
designer. An entire production studies lifeworld is employed in this complex infrastructure through which our data is assembled, and rendered visible and profitable. To
recognize these layers thus requires engaging at multiple levels, part of a broader
project of understanding the worth of “below the line” labor (Mayer 2011).
Data Sweat
Yet the idea of data agents still presumes a degree of distance between the individual
and the information that circulates about an individual. It implies segregation as much
as a process: I give my data to someone or something that can use it, hopefully to my
advantage. Any number of events suggests the naivety of this aspiration, especially
where there is a profit to be made. A more accurate way to think about our relation to
data that avoids this gift economy is through the body. It is true, for example, that data
may act like a shadow at times: our identifying data casts a shadow when we place
ourselves in the glare of certain platforms or transactions. When recorded and processed at scale, data offers a rough outline of who we are and the form and function of
our digital projection for anyone motivated and literate enough to see. But this kind of
analogy suggests we have some say in the interactions we choose to make, that we can
predict, like the turning of the sun, the ways in which our data will be rendered visible
and available. Instead of the visual metaphor of the shadow, then, we might consider
an alternative and more visceral language to think past ocular-centric ideas of information sovereignty.
The idea of data sweat came to me in the course of giving a talk as a visiting speaker
at a virus protection company in Taipei. The topic for discussion was data privacy and
security, and as we were chatting, the air-conditioned building had a varied effect on
the workers in attendance. Sitting in the crowded room, each person had their own way
of dealing with the pre-typhoon heat, from fanning to slouching to wiping damp brows.
Locals knew that any attempt to leave the building to walk the mid-afternoon streets
would lead to gross discomfort. This contextual awareness led them to make all kinds
of climate-dependent decisions, from choice of footwear (no heels) to transport (train
or taxi), or just staying late at the office. One of the most enthusiastic audience members to introduce herself following my talk carried a tissue in hand to ameliorate her
facial sweat, a taken-for-granted part of her daily ensemble.
Sweat is a characteristically human trait. It is a vital sign that our bodies are working, even if cultural norms differ as to how much this expression should be public. In
some cultures, for example, sweat can show enlightenment, possession, or commitment. It can just as easily suggest fear, anxiety, or arousal. Given this, sweat can appear
when we may not want it. A whole industry of perfumes, deodorants, and other innovations now accommodates the need for disguise and masquerade in the process of
maintaining social acceptability. Organic, corporeal phenomena such as sweat (but
also microbes and genomes)9 illustrate the existence of data that is essential about us.
This is data that speaks, albeit voicelessly, on our behalf. Sweat literalizes porosity: it
seeps out at times and in contexts that we may wish it did not. It can be an annoyance
or an accomplishment depending on the situation. But it is always a measure of our
participation, our vitalism, and our presence in the social. Sweat leaves a trace of how
we pass through the world and how we are touched by it in return. It is the classic
means by which the body signals its capacity to “affect and be affected,” to use
Spinoza’s terms. Understood this way, the labor we engage in as we exercise and
exchange our data—especially in our efforts to clean up our image, present a hygienic
picture, and make ourselves look good—is a kind of sweat equity for the digital economy.10 It is a form of work we perform in the attempt to control what is ultimately beyond our capacity.11
The current experience of Big Data is one in which powerful interests benefit from
exploiting this lack of control. Turning the frame from one of personal sovereignty to
data sweat gives us a better way of recognizing a rights-based contribution to this economy; it describes the particular form of labor contributing to this common wealth (Hardt
and Negri 2009). This is not labor that can be measured in terms of hours worked on the
clock. To paraphrase Gordon Gekko: “data never sleeps.” Data work is beyond the measure of “clock time,” and yet, to the extent that it generates profits that require compensation, it requires us to think about value beyond measure. As Adkins (2009) argues,
While the break with the hegemony of clock time may lead to a break with certain kinds
of measure—especially those forms which operate externally to entities—this break may
also involve the emergence of new kinds of measure, specifically ones whose co-ordinates
may emerge from entities themselves.
Data Exhaust
To move toward such an alternative way of thinking, I want to conclude by pushing the
idea of data sweat to a plausible endpoint, through the notion of exhaust. This is not to
signal exhaustion, since we have seen how data production and management takes
place happily backstage, with or without our conscious effort. But rather, if data is a
trail that we leave in our wake as a result of our encounters with the world and things,
then this trail clearly has some undesirable effects. Within the tech industry, “data
exhaust” or “tertiary data” names the value that our presence retains after a unique
transaction (A. Williams 2013). It is used to quantify the multiple applications that our
digital identity provides beyond the gestures of an initial performance, to build business models based on the profits predicted from behavior cast by data. But exhaust is
a term with further connotations, especially when thinking ecologically about the hazards posed by the massive computation of data on an increasingly fragile
environment.
The clearest example of the environmental impact of Big Data is the investment in
property and electricity now required by server farms that hold the world’s seemingly
infinite packets of information. If data is the new oil, then data centers are the ports,
wells, and tankers. The move to “cloud computing” is nothing if not a misnomer in this
regard. Data that appears to be pushed to some higher, opaque place requires enormous physical infrastructure on the ground. To ignore these relationships, and the
geopolitics they engender, is to perpetuate long-standing asymmetries in the experience of computing (Pellow and Park 2002).
The further consequences of the data traffic moving between pipes and satellites
across the globe include the logistical transfer, freight, assembly, and dis-assembly of
always imminently redundant hardware (Rossiter 2014). Activists are documenting
the human impact of this transport, manufacturing, and scavenging ecology, from the
labor camps attached to Foxconn factories (Andrijasevic and Sacchetto 2013) to the
Coltan mines of the Congo.12 As wealthy countries ship toxic e-waste back to the point
of origin for disposal, the pleasures enjoyed through new social networks generate an
international chain of service and manual labor. To evoke the legacy of an earlier
moment of dystopic web theory, Big Data today translates to even bigger “data trash”
(Kroker and Weinstein 1994).
Beyond the Sovereign Spectacle
An awareness of data exhaust invites us to take responsibility for the colonial legacy
underwriting Silicon Valley mythology (Dourish and Mainwaring 2012)—the material
conditions attached to the abstract philosophy of freedom through computing. If our
ideas of data are to remain wedded to the imaginary of prosthetics (something that is
attached to us, once it is taken from us), then ideas of sweat and exhaust may yet prove to
have mobilizing potential. They can bring an assessment of environmental justice to
bear upon the empowering mythologies emanating from Silicon Valley. The view I
advocate in this paper, then, is that notions of personhood and sovereignty that perpetuate the fallacy that we can control our data will not assist in the cause of advancing
an ethical data economy. We need terms that account for data’s agency in tandem with
the human consequences of this new mode of production. Film and television studies
provide a register to explain this double movement, in which the assembly of data and
its capacity to act on our behalf each instantiate a form of “below the line” labor.
In his classic account of The Gift, Marcel Mauss ([1922] 1990) explains that nothing of value ever really comes for free. The forms of obligation that accompany a gift
are social and pressing. They involve calculations of honor, status, and reciprocity. To
offer a gift is to offer a part of oneself—the object is “never completely separated”
from the instigator of the exchange. In a highly mediated economy, in which data is
often traded without our knowledge, Mauss’s theory takes an interesting twist. If we
are never fully aware of the context in which our data is given, the social bond that is
formed lacks guidelines and nuance. The terms of obligation demanded of the giver
and receiver remain compromised and unclear.
To date, Big Data has appeared as a gift for tech companies seeking to reinvent
themselves from the triumphant years of desktop computing and lead the charge into
a new market for software services, security, and storage. As this frenzy has taken
place, we have lacked a human vision of rights in what is now regularly referred to
as an “Internet of Things.” Television and new media studies have always acknowledged connections between the worlds of business, entertainment, and everyday life,
and governance (Andrejevic 2004; Miller 2001; Ouellette and Hay 2008). And just
as audience studies needed the insights of production studies to square the account,
Big Data demands analyses that are attuned to both on-screen and behind-the-scenes
components of digital life. This paper identifies a vital role for new media theory in
encouraging better descriptions of data work. Applying media studies methods to
Silicon Valley not only expands the reach and purchase of these legacies for a new
moment but also creates a new set of political and ethical questions for the field.
Writing from an industry position—from inside the data spectacle—I hope to
encourage greater numbers of voices and actors to engage directly with those working “below the line” in the data economy, to speak loudly in support of different and
more inclusive casting choices and participants, and to drive different possibilities
for computing and data processing from within. In the data industries of the future,
a range of skills and literacies are going to be necessary to maintain just and fair
opportunities for all. As I have shown, it is the rhetorical and visual effects of data
compiled in the aggregate that television and new media studies are especially well
placed to assess. The aura enacted in the performance of the data spectacle demands
both theoretical precision and appropriate accountability. It requires new rights to be
imagined and secured for the mass of individuals currently captured in—if not
wholly captivated by—Big Data visions.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this
article.
Notes
1. I use the capitalized proper noun throughout in recognition of the special issue this article
joins. For a more specific discussion and critique of the Big Data conjunction and its present popularity, see the collection of papers assembled from research in the Intel Center for
Social Computing in Maurer (forthcoming).
2. Writing this paper coincided with my first year as Principal Engineer in User Experience
at Intel Labs, USA. As co-director of the Intel Science and Technology Center (ISTC) for
Social Computing, my role is to work with academic partners across multiple universities on five organizing themes: algorithmic living, creativity and collectivity, materialities
of information, subjectivities of information, and information ecosystems. These topics
provide a framework for collaborative research that guides industry professionals to better
understand the social aspects of computing that may be overlooked in traditional engineering approaches. This paper draws on observations and conversations at a range of ISTC
and tech industry events in the United States, Europe, and Taiwan over a twelve-month
period. Specific conversations are acknowledged where possible.
3. While my key reference for this kind of industrial reflexivity is Caldwell (2008), another
inspiration for this paper is Georgina Born (2004), whose rigorous study of machinations
within the BBC was a source of consolation throughout my first year at a leading technology company.
4. I am indebted to Katie Pine for this term and ongoing observations of how instruments for
auditing, accountability, and measure affect the everyday experience of a range of workers, especially in the fields of health care and medical practice. See, for example, Pine and
Mazmanian (2014).
5. Caldwell’s notion of production culture explains the behind-the-scenes labor underwriting Hollywood’s primary position in the film and television industry. It also offers a useful frame for the unique configuration of cultural authority now emanating from Silicon
Valley. Social anxieties currently attached to tech work in the Bay Area bear an interesting
correlation to previous concerns about television. To name just a few, how each communication technology (television vs. the Internet) creates a new industry for targeted advertising; the overinflated concentration of industry talent in one geographical area (LA vs.
San Francisco); the celebrity status of key participants (screen stars vs. hackers), and their
exceptionalism in the face of social norms; let alone the universalizing ideological aspirations of the industry as a whole, which, as a form of “soft power” in international trade
and diplomacy, acts as an index of U.S. imperialism. Thanks to Jason Wilson for helpful
conversations on these points.
6. Referencing the new materialism risks conflating specific traditions of thinking that
encompass the actor-network theories and applications inspired primarily by the work
of Bruno Latour, various strands of materialism understood through Deleuzian vitalism
(e.g., Braidotti 2013), German media theory traditions now most closely aligned with writers such as Parikka (2012), and object-oriented ontology (Harman 2002). In the Intel Science and Technology Center (ISTC) for Social Computing, the materiality of information theme has conducted research on auditing and measure that accompany the quantification of society (see Nafus, forthcoming); it also refers to the material practices of making, hacking, and repurposing that are accompanying the rise of consumer DIY electronics and maker culture. For another attempt to avoid binaristic thinking in labor theory, see Qiu et al. (2014).
7. See http://www.powersof10.com/film. Accessed June 15, 2014.
8. The somewhat discordant experience of intimacy produced through this novel combination of global communications infrastructure, logistics, and system sorting is deftly captured in the Facebook slogan, “Ship Love” (Sloane 2014).
9. Thanks to Lana Swarz for prompting this idea.
10. Thanks to Ken Anderson for the idea of “sweat equity,” and for many other forms of support as I wrote this article.
11. Ellie Harmon takes this idea one step further to suggest that companies such as Facebook are like the bacteria that live on our bodies and sweat. Personal communication, June 25, 2014.
12. See http://www.gongchao.org/en/frontpage for updates on Foxconn in particular. Accessed June 15, 2014. The Guardian has covered the ethics of Coltan mining for several years: see Taylor (2011) for a moving example. In January 2014, Intel CEO Brian Krzanich announced a new industry standard for sourcing “conflict free” minerals. See http://www.intel.com/content/www/us/en/corporate-responsibility/conflict-free-minerals.html and related activism through the “Enough” project: http://www2.american-progress.org/t/1676/campaign.jsp?campaign_KEY=6265. Accessed June 15, 2014.
References
Adkins, Lisa. 2009. “Feminism after Measure.” Feminist Theory 10 (3): 323–39.
Andrejevic, Mark. 2004. Reality TV: The Work of Being Watched. Lanham: Rowman &
Littlefield.
Andrejevic, Mark. 2013. Infoglut: How Too Much Information Is Changing the Way We Think
and Know. New York: Routledge.
Andrijasevic, Rutvica, and Devi Sacchetto. 2013. “China May Be Far Away but Foxconn Is on
Our Doorstep.” Open Democracy, June 5. http://www.opendemocracy.net/rutvica-andrijasevic-devi-sacchetto/china-may-be-far-away-but-foxconn-is-on-our-doorstep (accessed
August 16, 2013).
Arvidsson, Adam. 2011. “General Sentiment: How Value and Affect Converge in the
Information Economy.” In Sociological Review Monograph Series: Measure and Value,
edited by Lisa Adkins and Celia Lury, 39–59. London: Wiley-Blackwell.
Bezaitis, Maria. 2013. “The Surprising Need for Strangeness.” TED@Intel. http://www.ted.
com/talks/maria_bezaitis_the_surprising_need_for_strangeness.html (accessed August 15,
2013).
Born, Georgina. 2004. Uncertain Vision: Birt, Dyke and the Reinvention of the BBC. London:
Random House.
Braidotti, Rosi. 2013. The Posthuman. Cambridge: Polity.
Brunton, Finn, and Helen Nissenbaum. 2011. “Vernacular Resistance to Data Collection
and Analysis: A Political Theory of Obfuscation.” First Monday 16 (5). http://dx.doi.
org/10.5210/fm.v16i5.3493 (accessed June 8, 2014).
Caldwell, John T. 2008. Production Culture: Industrial Reflexivity and Critical Practice in Film
and Television. Durham: Duke University Press.
Dourish, Paul, and Scott Mainwaring. 2012. “Ubicomp’s Colonial Impulse.” In Proceedings
of ACM Conference in Ubiquitous Computing, 133–42. Pittsburgh, PA: Association for
Computing Machinery.
Gabrys, Jennifer. 2011. Digital Rubbish: A Natural History of Electronics. Ann Arbor:
University of Michigan Press.
Gill, Rosalind. 2011. “‘Life is a Pitch’: Managing the Self in New Media Work.” In Managing
Media Work, edited by Mark Deuze, 249–62. Thousand Oaks: Sage.
Gillespie, Tarleton, Pablo J. Boczkowski, and Kirsten A. Foot. 2014. Media Technologies:
Essays on Communication, Materiality, and Society. Cambridge: MIT Press.
Goffman, Erving. (1959) 1973. The Presentation of Self in Everyday Life. New York: Anchor
Books.
Halpern, Orit. 2014. Beautiful Data: A History of Vision and Reason since 1945. Durham: Duke
University Press.
Hardt, Michael, and Antonio Negri. 2009. Commonwealth. Cambridge: The Belknap Press of
Harvard University Press.
Harman, Graham. 2002. Tool-Being: Heidegger and the Metaphysics of Objects. Chicago:
Open Court.
Kroker, Arthur, and Michael A. Weinstein. 1994. Data Trash: The Theory of the Virtual Class.
New York: St. Martin’s Press.
Latour, Bruno. 2013. “From Aggregation to Navigation: A Few Challenges for Social Theory.”
Keynote address to the ACM SIG-CHI Conference, Paris, April.
Latour, Bruno, Pablo Jensen, Tommaso Venturini, Sébastian Grauwin, and Dominique Boullier.
2012. “The Whole Is Always Smaller than Its Parts: A Digital Test of Gabriel Tarde’s
Monads.” British Journal of Sociology 63 (4): 591–615.
Massumi, Brian. 2005. “Fear (The Spectrum Said).” Positions 13 (1): 31–48.
Maurer, Bill. Forthcoming. Big Data. Prickly Paradigm Press.
Mauss, Marcel. (1922) 1990. The Gift: Forms and Functions of Exchange in Archaic Societies.
London: Routledge.
Maxwell, Richard, and Toby Miller. 2012. Greening the Media. New York: Oxford University
Press.
Mayer, Vicky. 2011. Below the Line: Producers and Production Studies in the New Television
Economy. Durham: Duke University Press.
Mayer, Vicky, Miranda J. Banks, and John Thornton Caldwell. 2009. Production Studies:
Cultural Studies of Media Industries. London: Routledge.
McCarthy, Anna. 2006. “From the Ordinary to the Concrete: Cultural Studies and the Politics
of Scale.” In Questions of Method in Cultural Studies, edited by Mimi White and James
Schwoch, 21–53. Malden: Blackwell.
Miller, Toby, with Nitin Govil, John McMurria, and Richard Maxwell. 2001. Global Hollywood.
London: British Film Institute.
Mulvey, Laura. 1975. “Visual Pleasure and Narrative Cinema.” Screen 16 (3): 6–18.
Nafus, Dawn. Forthcoming. The Quantified Self. Cambridge: MIT Press.
Ouellette, Laurie, and James Hay. 2008. Better Living through Reality TV: Television and Postwelfare Citizenship. Malden: Blackwell.
Parikka, Jussi. 2012. What Is Media Archeology? Cambridge: Polity.
Pellow, David, and Lisa Sun-Hee Park. 2002. The Silicon Valley of Dreams: Environmental
Injustice, Immigrant Workers, and the High-Tech Global Economy. Cambridge: MIT Press.
Pine, Kathleen, and Melissa Mazmanian. 2014. “Institutional Logics of the EMR and the
Problem of ‘Perfect’ but Inaccurate Accounts.” In Proceedings of ACM Conference on
Computer Supported Cooperative Work, 283–294.
Qiu, Jack Linchuan, Melissa Gregg, and Kate Crawford. 2014. “Circuits of Labor: A Labor Theory of the iPhone Era.” tripleC: Communication, Capitalism & Critique. Forthcoming.
Race, Kane. Forthcoming. “Party ‘n’ Play: Online Hook-Up Devices and the Emergence of PNP
Practices among Gay Men.” Sexualities.
Rosenberg, Daniel. 2013. “Data before the Fact.” In Raw Data Is an Oxymoron, edited by Lisa
Gitelman, 15–40. Cambridge: MIT Press.
Rossiter, Ned. 2014. “Logistical Worlds.” Cultural Studies Review 20 (1): 53–76.
Scholz, Trebor. 2013. Digital Labor: The Internet as Playground and Factory. New York:
Routledge.
Seaver, Nick. 2012. “Algorithmic Recommendations and Synaptic Functions.” Limn 2: Crowds
and Clouds, August 16. http://limn.it/algorithmic-recommendations-and-synaptic-functions (accessed August 5, 2014).
Slater, Dan. 2013. Love in the Time of Algorithms: What Technology Does to Meeting and
Mating. London: Penguin Books.
Sloane, Garrett. 2014. “Mark Zuckerberg Gets Reflective as He Nears 30, Espouses Motto of
‘Ship Love’ at the f8 Conference.” Adweek, April 30. http://www.adweek.com/news/technology/mark-zuckerberg-gets-reflective-he-nears-30-157394 (accessed August 5, 2014).
Taylor, Diane. 2011. “Congo Rape Victims Face Slavery in Gold and Mineral Mines.” The
Guardian, September 2. http://www.theguardian.com/world/2011/sep/02/congo-women-face-slavery-mines (accessed August 5, 2014).
Williams, Alex. 2013. “The Power of Data Exhaust.” TechCrunch, May 26. http://techcrunch.
com/2013/05/26/the-power-of-data-exhaust/ (accessed September 9, 2013).
Williams, Travis D. 2013. “Procrustean Marxism and Subjective Rigor: Early Modern
Arithmetic and Its Readers.” In Raw Data Is an Oxymoron, edited by Lisa Gitelman, 41–59.
Cambridge: MIT Press.
Author Biography
Melissa Gregg is a Principal Engineer and researcher at Intel Corporation. Her publications
include Work’s Intimacy (Polity 2011), The Affect Theory Reader (co-edited with Gregory J.
Seigworth, Duke 2010), Cultural Studies’ Affective Voices (Palgrave 2006), and Willunga
Connects: A Baseline Study of Pre-NBN Willunga (2011).
Article
The Political Economy
of the Internet: Social
Networking Sites and a
Reply to Fuchs
Television & New Media
2015, Vol. 16(1) 52–61
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414527137
tvnm.sagepub.com
César R. S. Bolaño1 and Eloy S. Vieira1
Abstract
The privatization of the Internet meant not simply a passage from a state-logic
organization to an economic one but something more complex. The year 1995
marked a disruption when the National Science Foundation (NSF), the public agency
that controlled and exploited the network, transferred its regulatory responsibilities
to the private sector. Despite the system’s provision of free access to information, the
Internet’s entire economic logic was modified when advertising became the norm. The objective of this article is to summarize the history of the Internet and the points that are important to understanding its current political and economic logic via
an emphasis on social networking sites. Our argument also involves a Marxist critique
of a theoretical element that Fuchs has contributed to this discussion.
Keywords
Internet, political economy of the Internet, social networking sites, class struggle,
capitalism
Of Internet Political Economy: A Brief History
Supported by the Department of Defense during the Cold War, the U.S. government brought together scientists and the military to develop a network that could guarantee information security in the event of a nuclear attack by the Soviet Union.1 From their efforts, military and
1Universidade Federal de Sergipe, São Cristóvão, Brazil
Corresponding Author:
César R. S. Bolaño, Departamento de Economia, Centro de Ciências Sociais Aplicadas, Universidade Federal de Sergipe, Av. Marechal Rondon S/N - Sala 50, Andar Superior do CCSA, Núcleo de Pós-Graduação em Economia, Jardim Rosa Elze, São Cristóvão, SE 49100-000, Brazil.
Email: bolano.ufs@gmail.com
government officials, scientists, and high-tech workers created Arpanet. In this first
phase—from the 1960s to the late 1970s—the network grew slowly and gradually via public investments. It focused on experimentation, which was crucial to the development of many of the network technologies we have today, such as Ethernet cabling and the Transmission Control Protocol/Internet Protocol (TCP/IP).
By the late 1970s, other entities entered the field when the public agency that controlled and exploited the network, the National Science Foundation (NSF), granted these same capacities to the private sector. In 1979, the first information service, known as Compuserve, was created. In 1985, the Domain Name System (DNS) introduced a hierarchy for naming the machines connected to the network. At the same time, the Bulletin Board System
(BBS) started to be used as one of the first communications services through the network. It was developed by America Online, which became the world’s first major
Internet service provider (ISP) in the 1990s. The NSF made good use of these first
backbones for the system it created. Besides these technical advances, people looked
to create the necessary hardware to access the Internet. In 1989, Tim Berners-Lee and
Robert Cailliau, both scientists from the Organisation Européenne pour la Recherche
Nucléaire (CERN), developed the web and released it in 1991 as the World Wide Web
(WWW). The WWW involved a new language pattern that allowed multidirectional
hypertext and required an Internet browser.
The year 1995 marked a disruption between these two models of organizing the
Internet. The NSF managed only the network infrastructure, while private companies, such as Prodigy, AOL, Compuserve, and Teletel (France), became the first major ISPs (Bolaño et al. 2011). This new regulation2 allowed these companies to exploit the market for the new network and profit from it.
Privatization allowed free access to information. Nevertheless, the entire logic of
the Internet was modified when advertising became the economic model. This meant
not simply the passage from a state-based economic logic to a commercial one but something more complex. On one hand, it was a shift from a public economy, based on state investment, to a market economy organized around different kinds of commodification; on the other hand, it was a shift from a political-military logic to one of privatization, regulation, and economic globalization, intended to support capitalist restructuring and the maintenance of U.S. economic hegemony in international relations (Bolaño et al. 2011).
The possibility of transforming small businesses managed by young college students into large Internet firms helped to revive the old myth of the “self-made man” within the Internet business environment. In fact, it is an example of a spatially concentrated cluster of innovative firms that benefited from political decisions, was linked to important university centers, and was supported by major venture capital companies (firms specialized in earning money by owning equity in new companies, usually start-ups and other high-risk, innovative businesses), the first investors in early-stage businesses.
The founders and CEOs of companies such as Yahoo!, Google, and Facebook, for
example, came straight from Stanford University, where they were supported with
infrastructure like data servers, and received, in crucial moments, the support of
venture capital investment companies such as Sequoia Capital that invested about
US$3 million in Yahoo! in 19953 and, later, together with Kleiner Perkins Caufield & Byers, invested US$25 million in Google in 1999.
This scenario, with support from the academy, the private sector, and the State, was
so attractive to venture capitalists that it was responsible for the Internet bubble in the
early 2000s. When Amazon.com share values surpassed Boeing’s in 1999, many other
online companies had their shares overvalued. Nasdaq received
a major influx of capital, overvaluing infrastructure companies like Cisco Systems,
IBM, Informix, Oracle, Microsoft, and Sun Microsystems (Monteiro 2008). Then, the
Internet bubble popped. The companies that grew afterward are the main Internet players we know today.
The bubble made companies change their strategies. Thenceforward, they used the
Internet not only as a tool but also as a platform that was
characterized by the provision of services specifically the ones aimed at accessibility,
communication and information (network access providers, content hosting, e-mail, interest
groups, chat rooms, search engines, e-commerce, among others). So, companies that work
and profit (production/processing/distribution) from information . . . represent a new phase
of capitalist accumulation within the production of information. (Monteiro 2008)
The industrial capitalist model of organization developed in the first half of the twentieth century produced and disseminated information, knowledge, and culture at uneven levels across different media. The Internet is not only an information and communications technology (ICT), nor is it merely some kind of new industry; it is a space for the convergence of all industrialized cultural production. The Internet is the result of the development of new technologies and their interpretation through global expansion (Bolaño et al. 2011).
The technological development that resulted in the creation of the Internet was only
the first step in establishing a new model of profit based on another model already known to the Cultural Industry, namely, the audience commodity. The audience commodity is an intermediary product, traded in an intra-capitalist market (Braz 2011), that may attract commercial and state interests at the same time. Much like the U.S.
television market, in which programs are offered for free to the audience, many
Internet services (e-mail, news, communication, weather, games, and freeware) are
offered free of charge to the users in order to get their attention. As with television, the
audience is the product. “The audience buyers are exactly the sellers of goods and
services, authorities, politicians, or, in just one word, everyone who needs to communicate with the audience” (Bolaño 2000, 115-116). Or according to Monteiro (2008),
“The migration of major trade companies, media and entertainment to the Internet
transformed the international network into another Culture Industry and social commoditization vehicle.” Before the Internet, companies never had as many opportunities to track and keep so much information about their customers. Today, consumers’ data chase the advertisers, rather than advertisers chasing consumers. This happened precisely because the new platform permitted the storage of so much data, which could then be repurposed and exploited (Fuchs 2011).
So, any product or service offered by Internet companies has a dual character.
On one hand, they are commodities produced by informational companies. On the
other hand, even though they are offered for free, they are also the means to reproduce
advertisers’ capital in the final stage of the mercantile circulation process. Advertisers
effectively sponsor the system.4 Thus, we may conclude that there are no differences between the political economy of the Internet and that of the twentieth-century culture industries. In both cases, the concept of “commodity duplicity” (Bolaño 2000) applies. At the same time, there is an important difference between television and the Internet. To
explain this difference, we turn to a discussion of social networking sites (SNSs).
The SNS Model
In order not to confuse our topic with the much older concept of a social network, we follow Recuero’s (2009) definition of SNSs as the Internet sites that host social networks. Although almost every computer-mediated communication system permits social networks, what makes SNSs different from other systems is the possibility of constructing and publishing a social network through the web. SNSs allow users to (1) construct a public or semi-public profile within a bounded system, (2) articulate a list of
other users with whom they share a connection, and (3) view and traverse their list of
connections and those made by others within the system (boyd and Ellison 2007).
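The three features just listed can be read as a minimal data structure. The following sketch is purely illustrative and is not drawn from boyd and Ellison or from any actual platform; all names and functions are hypothetical, meant only to make the profile/connection-list/traversal triad concrete.

```python
# Illustrative only: a toy rendering of the three SNS features named above
# (profile, articulated connection list, traversal of connections).
from dataclasses import dataclass, field

@dataclass
class Profile:
    user_id: str
    visibility: str = "semi-public"                   # (1) public or semi-public profile
    friends: set[str] = field(default_factory=set)    # (2) articulated list of connections

network: dict[str, Profile] = {}

def add_user(user_id: str) -> None:
    network[user_id] = Profile(user_id)

def connect(a: str, b: str) -> None:
    network[a].friends.add(b)
    network[b].friends.add(a)

def friends_of_friends(user_id: str) -> set[str]:
    # (3) traversing one's own connections and those made by others
    result: set[str] = set()
    for friend in network[user_id].friends:
        result |= network[friend].friends
    return result - {user_id} - network[user_id].friends

# Usage: three users, two connections, then a traversal of the graph.
for u in ("ana", "bia", "caio"):
    add_user(u)
connect("ana", "bia")
connect("bia", "caio")
print(friends_of_friends("ana"))  # {'caio'}
```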
SNSs first appeared just as the Internet became available to ordinary users. SixDegrees, launched in 1997, was the first SNS with these features. Besides profiles, users could create friend lists and, from 1998, browse those lists. Despite one million early adopters, the website did not meet investors’ expectations, and the site closed in 2000. Afterward, other platforms were more successful thanks to their audience targeting; LiveJournal, AsianAvenue, BlackPlanet, MiGente, Fotolog, LunarStorm, Cyworld, and Ryze are the best examples. They gave way to three major SNSs in the early 2000s.
The first major SNS was Friendster. It had so many users that Google intended to buy it in 2003 (Dybwad 2009). Even though it lost some users to MySpace, the second big SNS, especially in the United States, Friendster received more than US$50 million in venture capital. One of its main investors was MOL Global, the biggest Internet company in Asia. Based in Kuala Lumpur, Malaysia, MOL acquired the company in 2009 for more than US$26 million (Arrington 2009). The company then shifted the focus of the platform to online games and other entertainment products for Asian consumers.
Another notable SNS, prominent from 2004, was MySpace. It was propelled by musicians and indie groups using the SNS to publish their work and to host mp3 music files. In 2005, News Corporation bought MySpace from Intermix Media for US$580 million. In the following year, the site faced phishing attempts, spam, and malware, leading many users to abandon the network. When Yahoo! tried to buy MySpace, the SNS was said to be worth about US$12 billion (Aamoth 2008; see also Bolaño et al. 2013). After a brief golden age, MySpace went into decline. It lost about 10 million users in just one month (Barnett 2011) when the board of directors decided to change MySpace from an SNS into a website focused only on entertainment, music, TV, movies, and celebrities. In 2011, News Corp. sold MySpace to Specific Media for US$35 million,
only 6 percent of what News Corp. had paid. Specific Media tried to revive the site,
but the effort was not successful (Segall 2011).
The third great SNS market entrant was Orkut, a project designed in 2001 by a Google engineer of the same name. Google now had an SNS. Together with its surveillance and monitoring systems, Google could manage the information collected by the SNS and cross-reference it with its other search engine databases. According to Bruno
(2006, 155–56), Google collects various categories of personal data because
[I]ts main objective is not to produce knowledge about a specific individual, but about
groups and populations organized by financial, biological, behavior, professional,
educational, actuarial, racial, geographic categories, and so on. This is an infra-individual
level of use. Meanwhile, the database is not merely an archive, but carries the functions
of registering, classifying, predicting and mediating the data. Algorithms and profiles act
to tell all and know how to control the past, present and future of individuals . . . The
cross-referencing of the data categories will project, simulate, and anticipate the profiles
that correspond to “real” bodies for surveillance, care, treatment, information, consumer
deals, including those on or excluded from marketing lists, direct marketing, and public
campaigns to prevent risk.5
The Orkut case can be generalized to other SNSs. In sum, consumers receive the service for free. The SNS company is paid by advertisers, as in broadcasting, but user reception is active, unlike broadcasting. Users insert their information into the SNS, which then fits that information into categories matched against its other databases. This means the audience commodity can be extremely segmented for sale to advertisers. SNSs make the work of company employees easier by helping them produce the statistics, interfaces, algorithms, and other mechanisms that compose the audience commodity.
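As a purely illustrative sketch of the segmentation just described (our addition, not taken from the article or from any actual SNS; the category names and thresholds are hypothetical), user-supplied attributes can be mapped into salable audience categories along the following lines:

```python
# Illustrative only: hypothetical mapping of user-declared attributes into
# audience categories of the kind that could be offered to advertisers.
# Nothing here reproduces an actual platform's classification scheme.

def categorize(user: dict) -> list[str]:
    categories = []
    if user.get("age", 0) < 30:
        categories.append("young-adult")
    if "travel" in user.get("interests", []):
        categories.append("travel-intender")
    if user.get("city") == "Aracaju":
        categories.append("geo:Aracaju")
    return categories

users = [
    {"id": 1, "age": 24, "interests": ["travel", "music"], "city": "Aracaju"},
    {"id": 2, "age": 41, "interests": ["cooking"], "city": "São Paulo"},
]

# The "audience commodity" in this toy model: user ids grouped by category.
segments: dict[str, list[int]] = {}
for u in users:
    for c in categorize(u):
        segments.setdefault(c, []).append(u["id"])

print(segments)
# {'young-adult': [1], 'travel-intender': [1], 'geo:Aracaju': [1]}
```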
The Internet as an Accumulation Platform:
The Case of SNSs
What makes the capital accumulation process for the Internet different from broadcasting is precisely the way it acquires the audience commodity. Television advertisers
buy statistics about potential viewer attention to advertisements, a passive audience
model. Internet companies instead may offer and refine information collected from an
active audience when users spontaneously provide data about their personal tastes,
preferences, desires, and pathways through their browsers (see also Pariser 2012).
Internet advertisers thus can more accurately target the audiences they intend to reach.
We are not affirming that this is the only model of capital accumulation on the
Internet. Many different kinds of business organizations and models coexist with
many other forms of communication that are not necessarily mercantile-based. In the
case we are discussing, however, the final consumer does not pay anything; every product or service offered by the companies is financed by a third party, the advertiser, who buys the audience commodity obtained in this business model, also known
as “the club logic” (Tremblay 1997). Under these conditions, the concept of audience
commodity—which was usually linked to the broadcasting system—now reemerges
in this three-way mediation structure on the Internet. In analyzing SNSs, this structure is essential for evaluating the contemporary competition among the companies that use the Internet as a business platform for buying and selling information, not just as a simple tool. Of course, there is more to this story. Significant moves in the mobile phone network and hardware sectors will lead to important changes in the near future, but that is beyond the scope of this article.
Fuchs (2012a, 2012b, 2012c) also highlights the transformation of users into audiences. His concept of the audience commodity, however, follows an older definition, first proffered by Dallas Smythe and different from ours, as revealed in one crucial excerpt:
Due to the permanent activity of the recipients and their status as prosumers, we can say
that in the case of Facebook and the Internet, the audience commodity is an Internet
prosumer commodity (Fuchs 2012a, 711).
In Fuchs’ capital accumulation model for SNSs, any user activity, as well as any content eventually produced by users, is of interest to the Internet company only as raw material that informational workers then turn into the audience commodity and sell to advertisers. The workers are the only ones who produce economic value, by refining users’ data through software, algorithms, and other intellectual tools. Fuchs proposes here that the users’ activity does not produce exchange-value. Instead, he argues that Internet companies exploit SNS users in two ways. First, companies mine user-produced content as raw material for their search engines’ cataloging systems. Without the “free” content generated by users, Google would never be able to retrieve its search results. Second, the companies’ surveillance of users’ browsing habits, whether in the search engine or via SNSs, rests on users’ tacit permission for these companies to track, stockpile, and manipulate the information derived from usage.
Fuchs is actually identifying a more singular process, perpetuated by the companies’ most powerful mass subjectivity-capturing search engines. Above all, this process is not a kind of exploitation, let alone the two kinds of exploitation that Fuchs claims. Search engines use not only the information produced by users at no charge to them but also the information contained in their browsing traces. These produce, at the end of the process, the audience commodity. Thus, there is no productive work in the actions of what the author calls “prosumers.” The following excerpt highlights the author’s error:
Google does not pay the users for the production of content and transaction data. Google’s
accumulation strategy is to give them free access to services and platforms, let them
produce content and data, and to accumulate a large number of prosumers that are sold as
a commodity to third-party advertisers. (Fuchs 2012b, 45)
If SNSs followed a purely mercantilist logic with price exclusions, as in the case of
cable TV, users would pay for access to the service. Nevertheless, Google would not
have to pay users for the product that is offered to them, because the adopted financing
model consists of selling the audience commodity, just as in the case of standard
broadcast television. The main difference between the latter two, as previously stated,
is that the audience commodity is composed of the users’ information. Smythe also
made a mistake in arguing that people watching TV were working for the advertisers,6
but now this mistake returns in Fuchs’s tour de force.
The productive labor in SNSs is precisely the work done by the employees, engineers, researchers, and many other kinds of professionals who produce the statistics, interfaces, and algorithms that make the constitution of the audience commodity possible. The content produced by users is simply the raw material necessary for that work. Fuchs cannot see the problem of commodity duplicity, which is intrinsic to the culture industries. So, he affirms that with SNSs:
Not a product is sold to the users, but the users and their data are sold as a commodity to
advertisers. Google’s services are not commodities. They are free of charge. The
commodity that Google sells is not Google services (like its search engine), but the users
and their data. (Fuchs 2012b, 45)
When Fuchs says that Google services are free of charge, he does not consider the role
of the advertisers. So, although they are free of charge for the user, someone else is
paying for them.
What really occurs is more complex. The user receives the SNS service for free because there is a “third-payer” (tiers payant in French) that finances the process. Individuals do not pay, in other words, because advertisers pay for the process, which Herscovici (2009, 9) also calls “indirect commoditization.” In this case, the server (human or electronic) plays the central role, negotiating the rights of circulation by elaborating the marketing strategies and offering the products or services in exchange for a subscription (Tremblay 1997). At the same time, as we have argued, the audience is also produced as a commodity, with its own exchange-value and specific use-value, just as it was in the old broadcasting industry model. What Google sells, by the way, is not the users themselves, as Fuchs proposes in the above excerpt, because the advertiser does not buy any individual users or even their individual information. Advertisers buy only aggregated data about a target audience, organized by categories, as we have outlined.
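This distinction can be made concrete with a small, purely hypothetical example (our addition, not drawn from Google’s or any platform’s actual systems): what changes hands in such a model is an aggregate description of a category, never the underlying individual records.

```python
# Illustrative only: the advertiser-facing view in this toy model is an
# aggregate per category, not the individual user records a platform holds.

internal_records = [  # held by the platform; never sold as such
    {"user": "u1", "category": "travel-intender", "age": 24},
    {"user": "u2", "category": "travel-intender", "age": 31},
    {"user": "u3", "category": "cooking", "age": 41},
]

def advertiser_view(records: list[dict]) -> dict[str, int]:
    # What is priced and sold here: audience size per category.
    sizes: dict[str, int] = {}
    for r in records:
        sizes[r["category"]] = sizes.get(r["category"], 0) + 1
    return sizes

print(advertiser_view(internal_records))
# {'travel-intender': 2, 'cooking': 1}
```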
Looking ahead, we agree with O’Reilly (2006) that the company that is capable of targeting a critical mass of participant-users, and of transforming them (or, more precisely, the information those users generate), will be the winner. The capacity to invest directly in the personalization/relevance binomial is crucial to remaining competitive in
this market, because “For marketers, more data could mean getting closer to the ultimate goal of advertising: Sending the right message to the right consumer at the right
time” (Sengupta 2013, 2).
Conclusion
During the 1990s, the liberalization and restructuring of both the Internet and the telecommunications industry, in accordance with the project to create a global
informational infrastructure, started a new phase of commercialization that enabled an exponential rise in the number of corporations selling infrastructural products and basic network services for the web. This phase came to an end when the Internet bubble popped. The recent wave of market concentration has left a small oligopoly. Yet the social logic of the cultural industries is the same as before: industries devise innovative services that reach massive numbers of viewers or users, who amuse themselves and, at the same time, relinquish their personal information to the databases that are really responsible for corporate profits. In recent years, competition within this logic has homed in on SNSs as the newest extension of this process. Google and Facebook are the biggest exemplars of corporations that, through tracking and collecting information, are today transforming collective subjectivity into profit.
In this article, we have summarized the particularities of the political economy of SNSs and its similarities to and differences from broadcast television’s economic model. The production of the audience commodity is the permanent anchor of both systems. Fuchs perceived this well, but his theoretical reading suffers from the same shortcomings that we have seen in the foundations of an Anglo-American agenda for a critical political economy.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The authors received no financial support for the research, authorship, and/or publication of this
article.
Notes
1. We may distinguish the network (known nowadays as the Internet) from the World Wide Web (WWW). The former is the technical infrastructure; the latter is only the interface, created at the beginning of the 1990s. The web was a milestone in Internet history because it allowed ordinary people to access the network.
2. We use here the French School conception of regulation.
3. See the Yahoo! Timeline at http://online.wsj.com/news/articles/SB1000142405297020351
3604577140950546379684
4. There is an extensive discussion among Marxism’s core followers since Baran and Sweezy
(1966), as interpreted by Smythe (1981), about the role of advertising in the process of
capital reproduction. Here we follow the position taken by Bolaño (2000).
5. Here we are certainly building on approaches based on other theorists, such as Foucault
and Deleuze, but our approach is strictly Marxist. As such, we consider any kind of technological development in capitalism useful to exploitation and domination systems in a
contradictory way. The Internet is an example of this. On one hand, it is a large structure
for horizontal communication, allowing many social movements to act. On the other hand,
we cannot see this positive feature unilaterally with an optimistic or relativistic eye, which
would see technological development as neutral.
6. There is no doubt that Smythe deserves credit for being the first to properly formulate the question of communication in the Marxist field, which led to the first school of the Political Economy of Communication stricto sensu in the world. Nevertheless, his solution contains a mistake that is well known in the Anglophone literature. In the Ibero-American field, see Bolaño (2000). The Spanish edition of this book was published by Gedisa, in Madrid, in 2013, and the English edition is forthcoming.
References
Aamoth, Doug. 2008. “Microsoft Calls Yahoo! Decision ‘Unfortunate.’” Techcrunch, February 12.
http://techcrunch.com/2008/02/12/microsoft-calls-yahoo-decision-unfortunate/ (accessed
March 23, 2013).
Arrington, Michael. 2009. “Friendster Valued at Just $26.4 Million in Sale.” Techcrunch,
December 15. http://techcrunch.com/2009/12/15/friendster-valued-at-just-26-4-million-insale/ (accessed August 13, 2013).
Baran, Paul, and Paul Sweezy. 1966. Monopoly Capital: An Essay on the American Economic
and Social Order. New York: Monthly Review Press.
Barnett, Emma. 2011. “MySpace Loses 10 Million Users in a Month.” The Telegraph, March
24. http://www.telegraph.co.uk/technology/myspace/8404510/MySpace-loses-10-millionusers-in-a-month.html# (accessed May 9, 2012).
Bolaño, César. 2000. Indústria Cultural, Informação e Capitalismo [Cultural Industry, Information and Capitalism]. São Paulo, Brazil: Hucitec/Polis.
Bolaño, César, Alain Herscovici, Marcos Castañeda, and Daniel Vasconcelos. 2011. Economia Política da Internet [Political Economy of the Internet]. Vol. 1. Aracaju, Brazil: UFS.
Bolaño, César, Valério Brittos, Fábio Moura, Paulo V. Menezes, and Eloy Vieira. 2013. Economia Política da Internet [Political Economy of the Internet]. Vol. 2. Mimeo. Aracaju, Brazil: UFS.
boyd, danah m., and Nicole B. Ellison. 2007. “Social Network Sites: Definition, History, and
Scholarship.” Journal of Computer-Mediated Communication 13 (1): 210–30. http://jcmc
.indiana.edu/vol13/issue1/boyd.ellison.html (accessed May 9, 2012).
Braz, Rodrigo. 2011. “O Lugar do Simbólico no Capitalismo: Uma Análise das Mudanças que Estão Acontecendo na Passagem do Modelo Fordista-Keynesiano Para o Toyotista-Neoliberal [The Place of the Symbolic in Capitalism: An Analysis of the Changes Taking Place in the Transition from the Fordist-Keynesian to the Toyotist-Neoliberal Model].” Paper presented at the I Congresso Mundial de Comunicação Ibero-Americana, São Paulo, Brazil.
Bruno, Fernanda. 2006. “Dispositivos de Vigilância no Ciberespaço: Duplos Digitais e Identidades Simuladas [Surveillance Devices in Cyberspace: Digital Doubles and Simulated Identities].” Revista Fronteiras – Estudos Midiáticos 8 (2): 152–59. http://www.revistas.univerciencia.org/index.php/fronteiras/article/view/3147/2957 (accessed May 12, 2012).
Dybwad, Barb. 2009. “Friendster’s Fate: Sold to Malaysian E-commerce Giant.” Mashable,
December 10. http://mashable.com/2009/12/09/friendster-deal-final/ (accessed May 8,
2012).
Fuchs, Christian. 2011. “Social Medium or New Space of Accumulation?” In The Political
Economies of Media: The Transformation of the Global Media Industries, edited by
Dwayne Winseck and Dal Yong Jin. http://fuchs.uti.at/wp-content/uploads/PEI.pdf
(accessed February 23, 2013).
Fuchs, Christian. 2012a. “Dallas Smythe Today -- The Audience Commodity, the Digital
Labour Debate, Marxist Political Economy and Critical Theory.” Triple C: Communication,
Capitalism & Critique 10 (2): 692–740. http://www.triple-c.at/index.php/tripleC/article/
view/443 (accessed February 23, 2013).
Fuchs, Christian. 2012b. “Google Capitalism.” Triple C: Communication, Capitalism &
Critique 10 (1): 42–48. http://www.triple-c.at/index.php/tripleC/article/view/304/330
(accessed February 23, 2013).
Fuchs, Christian. 2012c. “The Political Economy of Privacy on Facebook.” Television New
Media 13 (2): 139–59. http://fuchs.uti.at/wp-content/uploads/polec_FB.pdf (accessed
February 22, 2013).
Google. 1999. “Google Receives $25 Million in Equity Funding.” June 7. http://googlepress
.blogspot.com.br/1999/06/google-receives-25-million-in-equity.html (accessed October 18,
2013).
Herscovici, Alain. 2009. “Contribuições e Limites das Análises da Escola Francesa, à Luz do Estudo da Economia Digital: Uma Releitura do Debate dos Anos 80 [Contributions and Limits of the French School’s Analysis in Light of Digital Economy Studies: A Reinterpretation of the 1980s Debate].” Revista Eletrônica Internacional de Economia Política da Informação, da Comunicação e da Cultura. http://www.seer.ufs.br/index.php/eptic/article/view/152/127 (accessed March 25, 2013).
Monteiro, Arakin Q. 2008. “Orkut, Subjetividade Coletiva e Valor: Considerações Preliminares [Orkut, Collective Subjectivity and Value: Preliminary Considerations].” Revista Eletrônica Internacional de Economia Política da Informação, da Comunicação e da Cultura. http://www.seer.ufs.br/index.php/eptic/article/view/181/160 (accessed February 20, 2012).
O’Reilly, Tim. 2006. “O que é Web 2.0-Padrões de Design e Modelos de Negócios Para a Nova
Geração de Software [What is Web 2.0? -- Design Standards and Business Models for
the Next Generation of Software].” http://oreilly.com/web2/archive/what-is-web-20.html
(accessed February 22, 2014).
Pariser, Eli. 2012. O Filtro Invisível [The Filter Bubble]. Rio de Janeiro, Brazil: Zahar.
Recuero, Raquel. 2009. “Redes Sociais na Internet, Difusão de Informação e Jornalismo: Elementos para Discussão.” http://www.pontomidia.com.br/raquel/artigos/artigoredesjornalismorecuero.pdf (accessed March 9, 2013). [Also published in Metamorfoses Jornalísticas 2: A Reconfiguração da Forma, edited by Demétrio de Azeredo Soster and Fernando Firmino. Santa Cruz do Sul: UNISC.]
Segall, Laurie. 2011. “News Corp. Sells Myspace to Specific Media.” CNN Money, June 29.
(accessed October 17, 2013).
Sengupta, Somini. 2013. “What You Didn’t Post, Facebook May Still Know.” The New York
Times, March 25. http://www.nytimes.com/2013/03/26/technology/facebook-expands-targeted-advertising-through-outside-data-sources.html (accessed April 5, 2013).
Smythe, Dallas W. 1981. Dependency Road. Norwood, NJ: Ablex.
Tremblay, Gaëtan. 1997. “La Théorie des Industries Culturelles Face au Progrès de la
Numérisation et de la Convergence.” Sciences de la Société 40:11–23.
Author Biographies
César R. S. Bolaño is a professor at the Federal University of Sergipe, president of the Latin American Communication Researchers Association, director of the journal EPTIC Online and its network, and founder of the Political Economy of Information, Communication and Culture Latin Union (ULEPICC).
Eloy S. Vieira is a journalist working on the Internet’s political economy and a researcher at the Economy and Communication Observatory (OBSCOM), a group coordinated by César Bolaño at the Federal University of Sergipe.
Special Section: New Beginnings
Old Milestones and New
Beginnings
Television & New Media
2015, Vol. 16(1) 72­–76
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414552908
tvnm.sagepub.com
Vicki Mayer1
Abstract
This is a summary of milestones, both for the journal and editorial board members,
as well as the start of a coeditorship for the journal. A look back at Daniel Schiller’s
book Theorizing Communication supports the coeditors’ conception of a media studies
that broadly understands communication as labor.
Keywords
communication, editorial board, editorship, media studies
Editing Television & New Media (TVNM) has been a joy these past four years. The
alacrity with which authors, editorial board members, special issue editors, and reviewers have given their free labor toward a collective project that encompasses media
studies in a global and interdisciplinary way has been inspirational for me. Since I came on board in 2011, the journal has published nine special issues and collected eight more special sections within regular issues. Those have covered broad topics ranging
from informal media economies to civic media cultures, as well as current events, such
as the Murdoch wire-tapping scandals and the recent World Cup. Along the way, the
journal has been added to the Thomson Reuters Citation Index and increased from six
to eight issues a year. I believe the increasing scale of the journal reflects the diversity
of its scope and the efforts of its supporters.
To continue this vision, Diane Negra has graciously agreed to be coeditor of the
journal for the next three-year term. Together, we hope to continue to enhance the
quality and scope of TVNM content by following the rhizomatic paths of current media
scholarship and by digging new inroads in related critical fields. Our own explorations
of media structures and political economies, contents and representations, producers
1Communication Department, Tulane University, New Orleans, LA, USA
Corresponding Author:
Vicki Mayer, Communication Department, 219 Newcomb Hall, Tulane University, New Orleans,
LA 70118, USA.
Email: tvnmeditor@tulane.edu
and audiences have been complementary. At the same time, our strengths also reflect
our respective training (me in communication and Diane in film and television studies)
and our separate academic locations (me in the United States and Diane in Ireland).
In thinking about the interdisciplinary and international origins of media studies, I want to pause further to dwell on the contributions of Graeme Turner, Horace Newcomb, and Daniel Schiller. These are three editorial board members who were founding members of the journal and whose work added to the scope of the field. I have asked Gerard Goggin, Alisa Perren, and Tom Schatz, while also making a choice myself, to each select a book that we felt was a milestone for the study of media, even if it did not get its proper due at the time. Founding TVNM Editor and international humanities
historian Toby Miller rounds out this book review tribute to three exceptional and
path-breaking scholars by contextualizing their work in terms of a multifaceted media
studies. I will begin this special section with a look back at Daniel Schiller’s (1996)
book Theorizing Communication, a landmark text in furthering a media studies that
broadly understands communication as labor.
Media Studies as Communication
Defining communication studies, even to my colleagues, can be daunting. Including
symbols and structures, processes and practices, individuals and institutions, nearly
anything can fall within the discipline’s parameters, running the risk of it seeming like
a discipline about nothing. The range of U.S. educational institutions, with their own
divisions and subsequent naming rituals, does not help matters. Even a cursory glance
at my own bookshelf seems like a study in confusion, from the books on mind and
action to the tomes on telecommunications policy. The missing link connecting these
two areas, and thus the fragments of communication, is labor.
In Theorizing Communication: A History, Schiller (1996) illuminates how communication became narrowly obsessed with media power and woefully ignorant of the
human activities that ascribed power to media. Those activities, aka labor, slowly
seeped from a study of communication grounded in holistic social theories into one
concerned with either instrumental improvement or asocial abstractions. With the first
case, the combined efforts of ruling experts embedded in government, industry, and
academic institutions look to media as tools for enacting better social control, improved
political responses, or increased consumerism. With the second case, media simply
elide labor struggles when they exist as expressions of the public, users, common culture, audiences, and other totalizing expressions of community. In both cases, the real
social struggles that resulted in the current political economy are absent. As a result,
communication studies generally assume the prevailing social inequalities embedded in
media infrastructures as the norm and thus do not consider the possible alternatives.
I actually never read this book when it was first published. I explored its arguments,
though, as part of my first-year graduate seminar in communication history. Dan sat at
the head of the table with what looked like manuscript chapters neatly laid out. Coming
back to the book, I can see how radical his argument is in the context of media studies
today.
What made this book so important to me was the way it brought together cultural
criticism and critical political economy in theorizing communication as an integrated
study of the self and structure. Dan’s argument is admittedly U.S.-centric in that he
states the need for a historical tradition in the face of a “field parochially content with
Merton’s middle-range theorizing” (Ibid., viii). Dan focuses his central argument on
how U.S. communication studies began by relegating self-directed human activity into
two camps: mental thinkers and manual labor. The first camp included journalists and
salespeople, academics and managers. The second camp included the trades and
craftspeople who in the nineteenth century supported the idea of common carriers for
news and information, and opposed “corporate efforts to enclose creative endeavor”
(Ibid., 16). Production studies take note. More than a century ago, John Dewey
researched the notion that mental and manual labors were mutually constitutive of
human action, promising a radically democratic basis for understanding creativity
(Ibid., 30). His abandonment of this line, and subsequent arguments for a means of
“organized intelligence” to oversee mental production and secure democracy (Ibid.,
32), signified a momentous shift as U.S. communication scholars became more interested in their own expertise and less interested in the other workers whose exploited
labor maintained the privileged status of what would become known as the creative
class.
As the capacity for organized self-activity continued to be identified overwhelmingly
with the capitalist class and its deputies in and around the giant corporation, historically
unfolding class relationships between capital and labor tended to be conspicuous chiefly
by their absence. (Ibid., 81)
Not surprisingly, by separating mental and manual labor, communication theory has focused on the individual and professional capacities to create media as art, entertainment, or news. Similarly, the fruits of mental labor have had more value than manual
labor, such that access to media content takes precedence over the collective ownership of media industries or the redistribution of its gross over profits. Triggers for the
root causes of social ills—the monopolization of property regimes, the exclusion of
the majorities from the wealth they help generate, and the repression of class resistance—are subordinated in media studies to an unflagging focus on better contents or
technological fixes that provide more channels for inclusive representation, for public
access and sharing, even resisting information. The bitter irony of calling mental labor
“unproductive” throughout much of the twentieth century has been the way postindustrial theorists, such as Daniel Bell, could recuperate mental labor as the source of
profits in the information society. The untethering of information, representations, and
symbolic goods from all other goods and their production chains has made it easier to
reify them, and their technical–professional creators, as uniquely special in postindustrializing political economies.
The bifurcation of mental and manual labor left other lacunae in theorizing communication media. Textbook histories of communication media have tended to be little
more than a parade of successful communication technologies or a progressive account
of markets and democracy, with each one reinforcing the other. Yet Dan’s account of
the fierce labor struggles around the telegraph reminds readers of economic imperatives built into new technologies and the political battles to transform them. The eventual institutionalization of the telegraph as a natural monopoly that favored private
profiteering based on the initial pursuit of the wealthiest consumers should be included
in every history of communication media, both as an object lesson in corporate enclosure in the face of public opposition and as a story that will replay with the telephone
and the Internet.
Although this book was written perhaps before some of his best known and most
cited work on information and the digital economy, such as Digital Capitalism:
Networking the Global Market System (MIT Press 1999) or How to Think about
Information (University of Illinois Press 2006), Theorizing Communication sets the
stage for thinking about information as a product of human labor and its organization.
It also ends up being prescient about the cutting edges of critical communication research. The study of media labor and production is today hardly absent from communication conferences, nor is a sustained critique of the commoditization of information as a product that is not only like other goods but entwined with their production and distribution. The situated study of these topics in relation to different geopolitics and class
struggles has reintroduced the importance of theorizing communication more broadly,
while not losing sight of the specificities of historical time and place. A troubling division continues in the separate study of (laboring) producers and (not-laboring) audiences, “reproducing the very dichotomy between consumption and production that is
institutionalized by the culture industry” (Ibid., 194). But even that story is on the
horizon for those who have focused critically on free labor across numerous industries
and media.
Finally, I like this book because of the way it models to critical communication studies how to avoid sweeping generalizations about the discipline’s own origins. Dan is
highly attuned to the theoretical points of consensus and differences between theorists,
even those who were diametrically opposed in their politics. He also recoils against
scholarship built on straw-man propositions, such as those that reduce the critique of
cultural imperialism to a simple rejection of foreign imports (Ibid., 89). Dan’s careful
reading of theory seems repeatedly salient when I read a monolithic rendering of either
an intellectual formation, such as “second-wave feminists,” or a supposed battle, such as
the one rehashed between cultural studies and political economy, as if these were all
membership clubs with loyalty rewards programs. Instead, Dan demonstrates how so
many central thinkers in those movements—in particular Raymond Williams and Stuart
Hall—were involved in ongoing discussions, putting forth new propositions. Although I
may not agree with each element of his argument, Dan’s book makes me appreciate how
the study of communication could go deeper into theorizing what it actually is.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this
article.
References
Schiller, Dan. 1996. Theorizing Communication: A History. New York: Oxford University
Press.
Schiller, Dan. 1999. Digital Capitalism: Networking the Global Market System. Cambridge,
MA: MIT Press.
Schiller, Dan. 2006. How to Think about Information. Chicago, IL: University of Illinois Press.
Author Biography
Vicki Mayer is Editor of the journal.
Special Section: New Beginnings
Graeme Turner’s
Television Books
Television & New Media
2015, Vol. 16(1) 77­–85
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414552909
tvnm.sagepub.com
Gerard Goggin1
Abstract
This paper marks the contribution of Graeme Turner, an important figure in television
studies. It argues that over three decades, Turner’s work makes various significant
contributions to our understanding of television as a broader facet of culture. He does so through three key collaborative books—Australian Television (1989), The Australian TV Book (2000), and Locating Television (2013)—each representing a particular moment and response to a conjuncture of television. In particular, Turner’s work offers a
clear sense of how to grasp, analyze, and critique the transformations associated with
television’s new media dynamics.
Keywords
Graeme Turner, television, national television, international television, cultural
studies, Australian cultural studies
Often there is a touchstone book or two that marks and encapsulates a scholar’s influence upon the field. Graeme can lay claim to more of these books than most, especially as a sole author. He often achieved this through lucid, witty, and frequently trenchant textbooks, published in multiple editions. His British Cultural Studies was a key text
in defining cultural studies (Turner 1990) and went into three editions. Film as Social
Practice (Turner 1988), in four editions, was a widely set book on cinema and film
studies, a favorite of Meaghan Morris, who sees it as “written for students, not for the
politically monitoring gaze of colleagues . . . a book for people who want to learn
directly about film and society” (Morris, forthcoming). Turner’s books National
Fictions (1986) and especially Making It National (1994; see discussion of this latter
book in Miller, forthcoming) were instrumental in the forging of a distinctive and
1University of Sydney, Australia
Corresponding Author:
Gerard Goggin, Department of Media and Communications, Level 2, Woolley Building A20, University of
Sydney, Sydney, NSW 2006, Australia.
Email: gerard.goggin@sydney.edu.au
sophisticated approach to studying the national—in this case, Australia. For a long
time, and still, these and other books made him a key figure in Australian studies. This
critical and political investment in understanding the national drew controversy.
However, reconfigured and rethought, the national remains crucial to Turner’s work,
especially on television.
For this medium, there is no obvious book that serves as an entry point to understanding Turner’s impact on the field. Rather, television features across many (if not
most) of his books in which it is not necessarily the central subject. It does so not only
in some books I have already mentioned but also in books such as Ending the Affair:
The Decline of Television Current Affairs in Australia (Turner 2005) and Ordinary
People and the Media (Turner 2010).
Yet the field of television studies would be poorer if it were not for Turner’s central
collaborative contributions via three key books, each representing a crucial moment in
the development of television and television studies. These are Australian Television:
Programs, Pleasures and Politics (Tulloch and Turner 1989), The Australian TV Book
(Turner and Cunningham 2000), and Locating Television (Pertierra and Turner 2013).
Given the recent nature of Locating Television, and the fact that it is very likely better
known among an international audience, this piece focuses on the first two of these
books.
National Television: Australian Television (1989)
Australian Television was the third title in the series Australian Cultural Studies,
edited by John Tulloch under the imprint of leading Australian publisher Allen &
Unwin (established in 1976 as a subsidiary of British firm George Allen & Unwin,
becoming independent in 1990).1 What makes Australian Television so interesting to
the assessment of Turner’s work is that it encapsulates this distinctively generative
matrix of television studies. The collection is strongly engaged with the state of the art
of television studies internationally, at a rich moment of its definition. At the same time, the distinctively Australian thinking through of television “texts produced for and transmitted through Australian television” is precisely the point, as opposed to “British or American readers which employ unfamiliar examples of television programs and inappropriate models of cultural relations” (Tulloch and Turner 1989, xi, “Preface”). In doing so, the editors see a “range of theoretical traditions” as an important, as well as comprehensive, approach—and one that eschews the dominant ways of seeing television as both “trivial and powerful” (Tulloch and Turner 1989, xii).
A measure of the significance of the Australian Television volume can be found in
how it brings together an impressive group of scholars who were influential in the
formation of cultural and media studies internationally and in distinctively Australian
cultural studies. Lead editor Tulloch was one of a remarkable number of influential
British scholars living and working in Australia at this time. Nowhere was this dual
Australian/international coupling more apparent in cultural studies than in the area of
television. The received golden age myth of this time involves the fact that a significant number of these were based in the Perth, Western Australia, tertiary institutions:
John Fiske, John Hartley, Graeme Turner, and, later, Toby Miller. By the end of
the 1980s, Turner was settled in Brisbane, Queensland, first at Queensland University
of Technology, then at the University of Queensland. These expatriate contributors
were truly matched by the Australian scholars, including Tom O’Regan, Stuart Cunningham, and Albert Moran, the unsung pioneer of Australian cinema and television studies.
Against this dense crisscrossing of scholars and their joint ventures, Turner’s
chapter is pivotal to the conception of the Australian Television volume and gives us
a clear glimpse of his preoccupations as well as sketching the trajectories of his subsequent television research. Titled “Transgressive TV: From In Melbourne Tonight to
Perfect Match,” it aims to parlay pleasure and risk into the scene of television scholarship—against the backdrop of the “dominant point of view within academic discussions of the media” (Turner 1989, 26) that sees TV as “entirely hegemonic, inevitably
reproducing dominant views and attitudes to and for its audiences” (Turner 1989, 28).
Turner’s interest lies in television’s interplay between predictability, formula, and
ritual, on one hand, and the “break with the normal and the transgression of conventions” (Turner 1989, 27), on the other. He sees this productive tension occurring in
programs such as The Young Ones, Soap, Moonlighting, and the Australian classic Hey
Hey It’s Saturday (a variety show, which I grew up on). Picking up on Australian film
critic and scholar Adrian Martin’s idea of “stretch TV” (Martin 1985), Turner argues
that these programs involve a dynamic where
. . . a complicated relationship is set up between a formula that is, on the one hand,
familiar, predictable and largely observed and, on the other hand, that formula’s deliberate
subversion, the suggestion of a “real multiplication of possibilities.” (Turner 1989, 27)
Graeme sees this phenomenon as a “kind of performance—a spectacle of pure TV” (Turner 1989). He suggests that the “distinctiveness of these transgressive, ambiguous, television texts lies in the degree to which they transgress their own conventions and thus invite a range of possibly contradictory response from their audiences” (Turner 1989, 29). In making this argument, he is aiming to effect a strategic break in extant television studies from its default tendency to seek out and valorize the aesthetic, reflective moments: “transgressive television works to minimize television’s reflectiveness, its produced, static and unified aspects (its textuality, even), and its affinities with aesthetic forms like literature or film” (Turner 1989, 30). He also
wishes to be precise in how he accounts for television’s richness and potential for
resistant readings, explicitly against the positions of Hartley and Fiske. He is interested in the “specific pleasures” TV provides, which might offer “resistance to ideological control” or “facilitate social change” (Turner 1989, 35–36). Accordingly,
Turner concludes,
We need to differentiate between various modes of popular television . . . the ratings
achieved by transgressive programs have been telling us for years that TV does not have
to be simple or unified in order to be popular. (Turner 1989, 36)
Central to his argument is an analysis of the ways in which Australian television has
been characterized by the appropriation and adaptation—as we would now probably
term it—of imported program models. Thus, the very popular Australian romantic
game show Perfect Match (the two key presenters pictured on the book’s cover) is a
leading example of shows that
occupy positions of influence and produce affectionate memories as key moments in the
development of locally produced Australian television. It is probably significant that the
programs which excite particular affection are those which were based on, and
outrageously transformed, more benign foreign models: Perfect Match and the innocuous
American show, Blind Date, or In Melbourne Tonight and The Steve Allen Show. (Turner
1989, 34)
In a turn of phrase that recurs as a motif in his later work, Graeme suggests, “We need
to know more about the ideological and social functions of this kind of television”
(Turner 1989, 34).
It is this careful analysis that Graeme uses to address and situate the issue of “political objectives” (Turner 1989) of the kind abroad in much cultural studies work at the
time—but also in relation to the pressing yet complex issue with which he begins his
chapter, and which frames the volume and his entire work. What does the national, and
the local, amount to? And where does it fit into the international and global? Turner
responds that Australia is a postcolonial culture, and as such, the national is still in
dialogue with generic structures and readers’ expectations from long ago and far away.
This becomes an agenda for a television studies that addresses not only its own “specific industrial structures, sets of production practices, and repertoires of productions”
but also “the complex processes of appropriation and transformation of foreign models
that we can hear as the ‘accent’ of Australian television” (Turner 1989, 25). In subtly
different yet consistent terms, this is a governing idea that Graeme richly develops in
the next phase of his work.
Qualifying Television Studies: The Australian TV Book (2000)
Eleven years after Australian Television, there appeared a second edited volume titled
The Australian TV Book (Turner and Cunningham 2000). This is the second book
through which we can decipher Turner’s impact on a second moment in the field of
television studies and its instruction. The Australian TV Book was an effort to provide
a dedicated textbook for school and university courses on television, in particular.
This volume includes authors represented in its 1989 predecessor, notably O’Regan and Hartley, as well as Cunningham, with whom Turner had established a strong editorial collaboration by this time. In the 1990s, Cunningham had established himself as a mover and shaker, not least with his espousal of cultural policy as a key dimension of cultural studies (Cunningham 1992). In addition, newcomers who subsequently made their
mark on national and international television studies included Liz Jacka, Jock Given,
Terry Flew, and Alan McKee.
In the opening chapter, Turner argues why a national mapping of television remains
vital:
Australian television programming remains one of the key means through which we can
imagine ourselves belonging to a nation of common interests and experiences. (Turner
2000, 3)
Against this backdrop, Turner gives a thumbnail sketch of the development of television studies, especially the pivotal moments represented by Fiske and Hartley’s (1978)
Reading Television and the development of audience studies. From these two trajectories, Turner signals his agreement with Charlotte Brunsdon’s judgment that television
studies remains “textualized” (Brunsdon 1998), but he also draws a crucial distinction
that the field must attend to the specificity of the location of consumption. Referring
to the issues of regulation, ownership, and control that emerge from cultural policy
studies, Turner writes that the inhabiting of a place is constitutive and reframes one’s
orientation toward, and conception of, the field:
Significantly, while Brunsdon might feel entitled to talk about “television” without any
qualifying indicator of national location, Australian writers are much more likely to talk
about “Australian television.” Australian television studies has been fundamentally
defined by consideration of the specific local or national contexts within which Australian
television is produced and consumed. (Turner 2000, 9)
This is a very interesting and significant passage in a fairly short chapter. It is clear
how its implications unfold in the Australian TV Book, at least in the structure and
roster of the volume—though the actual essays spill over in various directions. I think,
though, it is in the current moment that the germ of the idea is fully worked out. For
the present, it is worth saying a little about the turn-of-the-century context for television studies, the Australian TV Book, and Turner’s research.
Television studies was coming of age, in a certain way—at least, as indicated by the
publication of a number of readers and guides to the field (Geraghty and Lusted 1998;
Miller and Lockett 2002, then, a little later, Allen and Hill 2004; Wasko 2005). In
2000, Horace Newcomb’s long-running anthology, Television: The Critical View, went into its sixth edition (Newcomb 2000), the first edition dating from 1975. Television itself had been reshaped by a range of overlapping developments through the 1980s and 1990s. The role of the nation was changing, requiring a
shift of perspective. These transformations were the subject of important work by
Australian-based scholars in the 1990s (Cunningham and Jacka 1996; Cunningham
and Sinclair 2000; Sinclair 1999; Sinclair et al. 1996).
It was in 2000 that Graeme established his own research center at the University of
Queensland—Brisbane’s CCCS (Centre for Critical and Cultural Studies). The first conference hosted by CCCS was a full-scale international affair, devoted to the state of the art of television studies—the first such conference in Australia. Held in early
December 2000, Television: Past, Present and Futures included leading figures in the
field, such as Lynn Spigel, who continued on postconference in a marathon five-day
Masterclass event. Among other things, the Television conference was memorable for
John Fiske’s keynote, his swan song in the year he retired from academic life. In his
keynote, he spoke of fundamental shifts in television’s social function, represented by
TiVo. Though subsequently introduced in Australia in 2008, TiVo has never functioned there in the same way as it did in the United States (see Meese et al. 2013). Despite this, at
the University of Queensland Television conference, Fiske’s nigh-prophetic account of
postbroadcast television was compulsive viewing:
[W]hat TIVO does is it enables each person, each individual viewer, to construct their
own TV channel, a TV channel that is precisely aligned to their interests and preferences
. . . this idea of technologising cultural choice is one I think we’ve got to take very
seriously indeed. (Fiske 2000)
I’m not sure that Fiske ever published a version of this paper. Before long, however,
TiVo was a frequently discussed harbinger of the future of television and new media
(for instance, Boddy 2004; Uricchio 2004; or John Hartley in the foreword to the
twenty-five-year reissue of Reading Television, see Hartley 2003). It is this scene of
television’s digital turn evoked by Fiske, and taken up by many others, which provides
the spur for the current moment of Turner’s work.
Conclusion
In the thirteen years that followed the Australian TV Book and the Television conference, Turner’s work blossomed in expansive, new, and increasingly international directions. Inevitably, in this short assessment of Graeme Turner’s television research, I have only been able to scratch some surfaces—and have only gestured toward the importance of his rich and recent team-based work on postbroadcast television.
My way of indicating some of the dimensions, governing ideas, and dynamics of Turner on television has been to—doubtless with a degree of arbitrariness—choose three key books, which I have argued illustrate three prime moments, each responding to a particular conjuncture. Such a schema has its limitations, as in the third moment (not elaborated here), when there turn out to be already two books (Pertierra and Turner 2013; Turner and Tay 2009b) and a third on the way (Tay and Turner,
forthcoming); in particular, Locating Television (Pertierra and Turner 2013), for many
reasons, most obviously its conception as a full-length study, moves the field on quite
a distance.
In all this, the Australian location of Turner’s work matters a great deal. As the
Australian Television book reveals, there were some happy accidents at play in how television studies was conjoined in his part of the world in the 1980s. A commitment to, and constant reminder of, the specificity of Australian television predisposes him to work against the field’s dominant conceptions of television, media, and cultural studies, most recently in his critique of the “digital orthodoxy” (Tay and Turner 2010). Such dissatisfaction and critique push him to find canny and powerful ways to
leverage the constraints and resources of his institutional, geopolitical, and intellectual
topography to generate new accounts adequate to the various formations, scales, and
modes of television internationally.
Unusually for a thinker with such a distinctive voice, idiom, style, and cultural politics, Graeme Turner’s important television books are collaborative, including significant edited collections. There are many antecedents for the interplay of individual and
collective in the forging of a powerful body of work or a single book, as well as Turner’s
own integral and widely acknowledged generative work in collaboration. I would suggest that these dynamics of collaboration are present in subtle and important ways in Turner’s work on television and in how it threads into television studies generally.
While Turner’s books on television are legion, it may not be too much to ask for one
more—a compilation of his articles and chapters scattered across various edited volumes and more or less obscure journals. The warrant for this is of a piece with the
reason for reading him in the first place: to understand television’s histories, present,
futures, and, now, also its diverse locations.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this
article.
Note
1.
Australian Television is now available through Amazon.com, in a relatively inexpensive
Kindle edition.
References
Allen, Robert C., and Annette Hill, eds. 2004. The Television Studies Reader. London: Routledge.
Boddy, William. 2004. “Interactive Television and Advertising Form in Contemporary U.S.
Television.” In Television after TV: Essays on a Medium in Transition, edited by Lynn
Spigel and Jan Olsson, 113–32. Durham: Duke University Press.
Brunsdon, Charlotte. 1998. “What is the ‘Television’ of Television Studies?” In The Television
Studies Book, edited by Christine Geraghty and David Lusted, 95–113. London: Arnold.
Cunningham, Stuart. 1992. Framing Culture: Criticism and Policy in Australia. Sydney: Allen
& Unwin.
Cunningham, Stuart, and Elizabeth Jacka. 1996. Australian Television and International
Mediascapes. Cambridge: Cambridge University Press.
Cunningham, Stuart, and John Sinclair, eds. 2000. Floating Lives: The Media and Asian
Diasporas. St. Lucia: University of Queensland Press.
Fiske, John. 2000. “Interview with Mick O’Regan.” The Media Report, ABC Radio National,
7 December. http://www.abc.net.au/radionational/programs/mediareport-1999/professorjohn-fisk/3476232#transcript.
Fiske, John, and John Hartley. 1978. Reading Television. London: Methuen.
Geraghty, Christine, and David Lusted, eds. 1998. The Television Studies Book. London: Arnold.
Hartley, John. 2003. “Reading Television after 25 Years: A New Foreword.” In Reading
Television. 2nd ed., edited by John Fiske and John Hartley, ix–xxii. London: Routledge.
Martin, Adrian. 1985. “Stretch TV.” Xpress: Popular Culture 1 (1): 22–23.
Meese, James, Rowan Wilken, Bjorn Nansen, and Michael Arnold. 2013. “Entering the
Graveyard Shift: Disassembling the Australian TiVo.” Television and New Media.
Published electronically October 23. doi:10.1177/1527476413505919.
Miller, Toby. Forthcoming. "Dependencia Meets Gentle Nationalism." Cultural Studies.
Miller, Toby, ed., and Andrew Lockett, associate ed. 2002. Television Studies. London: British
Film Institute.
Morris, Meaghan. Forthcoming. “Turning up to Play: ‘GT’ and the Modern Game.” Cultural
Studies.
Newcomb, Horace. 2000. Television: The Critical View. 6th ed. New York: Oxford University
Press.
Pertierra, Anna, and Graeme Turner. 2013. Locating Television: Zones of Consumption.
London: Routledge.
Sinclair, John. 1999. Latin American Television: A Global View. Oxford: Oxford University
Press.
Sinclair, John, Elizabeth Jacka, and Stuart Cunningham, eds. 1996. New Patterns in Global
Television: Peripheral Vision. Oxford: Oxford University Press.
Tay, Jinna, and Graeme Turner. 2010. “Not the Apocalypse: Television Futures in the Digital
Age.” International Journal of Digital Television 1 (1): 31–50.
Tay, Jinna, and Graeme Turner, eds. Forthcoming. Television Histories in Asia. New York:
Routledge.
Tulloch, John, and Graeme Turner, eds. 1989. Australian Television: Programs, Pleasures and
Politics. Sydney: Allen & Unwin.
Turner, Graeme. 1986. National Fictions: Literature, Film, and the Construction of Australian
Narrative. Sydney: Allen & Unwin.
Turner, Graeme. 1988. Film as Social Practice. 1st ed. London: Routledge.
Turner, Graeme. 1989. “Transgressive TV: From in Melbourne Tonight to Perfect Match.”
In Australian Television: Programs, Pleasures and Politics, edited by John Tulloch and
Graeme Turner, 25–38. Sydney: Allen & Unwin.
Turner, Graeme. 1990. British Cultural Studies. 1st ed. Boston: Unwin Hyman.
Turner, Graeme. 1994. Making It National: Nationalism and Australian Popular Culture.
Sydney: Allen & Unwin.
Turner, Graeme. 2000. “Studying Television.” In The Australian TV Book, edited by Graeme
Turner and Stuart Cunningham, 3–12. Sydney: Allen & Unwin.
Turner, Graeme. 2005. Ending the Affair: The Decline of Television Current Affairs in Australia.
Sydney: University of New South Wales Press.
Turner, Graeme. 2010. Ordinary People and the Media: The Demotic Turn. Los Angeles: SAGE.
Turner, Graeme, and Stuart Cunningham. 2000. The Australian TV Book. Sydney: Allen &
Unwin.
Turner, Graeme, and Jinna Tay, eds. 2009a. “Introduction.” In Television Studies after TV:
Understanding Television in the Post-Broadcast Era, 1–8. London: Routledge.
Turner, Graeme, and Jinna Tay, eds. 2009b. Television Studies after TV: Understanding
Television in the Post-Broadcast Era. London: Routledge.
Uricchio, William. 2004. “Television’s Next Generation: Technology/Interface Culture/Flow.”
In Television after TV: Essays on a Medium in Transition, edited by Lynn Spigel and Jan
Olsson, 163–82. Durham: Duke University Press.
Wasko, Janet, ed. 2005. A Companion to Television. Malden, MA: Blackwell.
Author Biography
Gerard Goggin is a professor of media and communications and ARC future fellow at the
University of Sydney. He is widely published on new media, especially the social and cultural
dynamics of mobile media and the Internet. Gerard also has a longstanding interest in disability
and media. He worked for Graeme Turner in the Centre for Critical and Cultural Studies,
University of Queensland, from 2002 to 2005.
Special Section: New Beginnings
Theorizing Television’s
Writer–Producer: Re-viewing
The Producer’s Medium
Television & New Media
2015, Vol. 16(1) 86–93
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414552907
tvnm.sagepub.com
Alisa Perren1 and Thomas Schatz1
Abstract
Alisa Perren and Thomas Schatz honor television studies scholar and former Peabody
Director Horace Newcomb’s career. The authors illustrate how one of Newcomb’s
less frequently cited books, The Producer’s Medium (coauthored with Robert Alley),
expresses themes central to his larger body of work and serves as a groundbreaking
study of American television, authorship, and industry in its own right. In addition,
they illustrate key ways that the book might inspire contemporary investigations into
convergent-era television.
Keywords
authorship, production, management, showrunner, Hollywood, television
In the essay that follows, we pay tribute to the career and work of preeminent television studies scholar and former Peabody Director Horace Newcomb. We do so by
revisiting one of Newcomb’s early, less frequently cited, publications (with Robert
Alley), The Producer’s Medium (1983), which crystallizes many of the key themes
and concerns throughout Newcomb’s influential body of scholarly work. The essay
situates The Producer’s Medium historically in relation to Newcomb’s other work and
also to the early development of television studies, highlighting the book’s main argument and identifying some of its key contributions to humanities-oriented studies of
the television industry in the early 1980s. The essay then makes the case for the book’s
continued relevance in light of more recent developments in the medium, the industry,
and American culture. The Producer’s Medium offers a valuable jumping-off point
from which to assess the shifting industrial status of the writer–producer as well as
1University of Texas at Austin, USA
Corresponding Author:
Alisa Perren, Department of Radio-Television-Film, University of Texas at Austin, 2504 Whitis Ave., Stop A0800, Austin, TX 78712-1067, USA.
Email: aperren@austin.utexas.edu
broader cultural discourses about the medium. Newcomb and Alley’s watershed study
not only provides a means of understanding television authorship and aesthetics during the classic network era but also serves as a blueprint for later work focused on the
showrunner as a central creative and managerial force in the postnetwork era of
American television.
Since the stunning ascendance of the prime-time network drama in the 1990s with
series such as NYPD Blue (1993–2005) and ER (1994–2009) and the subsequent surge
in original cable series with The Sopranos (1999–2007) and Mad Men (2007–), journalistic and critical discourse about television has focused increasingly on the issue of
authorship. Names such as Steven Bochco, John Wells, David Chase, and Matthew
Weiner steadily came to dominate scholarly and trade press—and eventually popular
discourse as well—heralded as the “showrunners” behind these programs and the
chief architects of this perceived renaissance in television series programming. The
preponderance of such coverage might lead one to presume that the showrunner role
is of recent vintage. Indeed the label is—it first appeared in industry trade publication
Variety in 1990, and in the New York Times in 1995, in a feature on ER executive producer John Wells.1 The position is far from new, however. Also not new is the belief
that the producer’s position represents the key creative role in television series
production.
The Producer’s Medium was not the first book to address this particular role. Over
a decade earlier, sociologist Muriel Cantor (1971), in The Hollywood TV Producer,
provided an extensive study of this position based on interviews with fifty-nine prime-time television producers. Although Cantor's study stands as a foundational work, it lacks
the nuance and subtle theorization of Newcomb and Alley’s book, which has proven
to be far more important to the development of television studies. In contrast to
Cantor’s study, which emphasized the impact of social and industrial structures on
producers' motivations, creativity, and values, Newcomb and Alley (1983, xii) highlighted the agency that certain individuals—whom they labeled as "self-conscious
creative producers”—exercised within these structures. In this sense, The Producer’s
Medium was more firmly based in humanistic than in social science traditions, and it
was necessarily an extension of Newcomb’s earlier groundbreaking work in television
studies (see in particular Newcomb 1974, 1976).
The year 1983 was a banner year for Newcomb and for television studies in the
United States, as the more established fields of film studies and communication studies both struggled to accommodate this upstart field and its leading proponent.
Newcomb was on the first panel devoted to television at the annual meeting of the
Society for Cinema Studies (later SCMS). Horace Newcomb and Paul Hirsch (1983)
published their influential essay on television as a cultural forum, and Todd Gitlin’s
Inside Prime Time (1983), an ambitious ethnography of the contemporary television
industry, would become another milestone in the field. After a much-publicized wrangle with Newcomb in the Journal of Communication over the study of “media effects,”
George Gerbner, the dean of the Annenberg School for Communication at the
University of Pennsylvania and an outspoken critic of television violence and commercial television generally, edited a special edition of the journal titled “Ferment in
the Field” (see Gerbner 1983). And the National Communication Association
announced the launch of a new journal, Critical Studies in Mass Communication. The
debut issue included another Newcomb (1984) essay.
Although it was a propitious year for Newcomb, he was sorely disappointed at the
time that The Producer’s Medium did not attract more attention and cause more of a
stir. And it should have. The book was indeed overshadowed by Gitlin’s book, but in
our view, despite its low-key, unassuming approach, The Producer’s Medium was a
radical intervention into television studies. Newcomb and Alley's study of TV producers fashioned a theory of television authorship that offered a blueprint and a prolegomenon of sorts for understanding the industrial and creative roles of the showrunner in
contemporary television.
A Theory of Television Authorship
A feat of intellectual “indirection,” The Producer’s Medium presents some fairly serious theorizing in the guise of, as the book’s subtitle puts it, “Conversations with
Creators of American TV.” In fact, the book includes conversations with only eleven
producers, and the interviews comprise only about half of the 250-page book. The rest
consists of a fifty-page introduction on the producer’s role in American commercial
network television, most of it written by Newcomb, along with introductions of ten or
so pages to each of the conversations, briefly surveying the producer’s career highlights, modus operandi, and signature style.
The authorship argument in The Producer’s Medium is developed through three
fairly distinct stages. The first is a summary of the “cultural approach” to television in
the book’s introductory section. Here, the authors invoke James Carey’s “ritual view
of communication,” Victor Turner’s theory of entertainment and “liminality,” John
Fiske and John Hartley’s notion of “bardic” television, and Newcomb’s own work
with Hirsch on television as a public forum for examining and renegotiating our culture’s core—and inevitably conflicted and contradictory—values and beliefs.
The second stage of Newcomb’s argument suggests that the development of a cultural
approach to “the most popular art” has tended to overvalue the popular and overlook the
art—that “the roles of significant individuals have been minimized” and that there has
been “little discussion of the creative nature of much of the work that goes on in the
medium” (Newcomb and Alley 1983, 31). The central creative and administrative role in
the commercial American television workplace was and remains that of the producer—
that is, the individuals or collaborative teams who create series, deal with the networks
and production companies, hire and supervise the creative personnel (including directors, who in series television tend to come and go on a weekly basis), and manage the
overall production.2 This is a vastly complex process, of course, demanding the maintenance of the narrative operation so essential to any successful series.
The primary aim of The Producer’s Medium is to identify and assess “a handful of
producers . . . who have established a place for themselves and their work by successfully reading and responding to the culture, criticizing it and creating new forms within
it” (Newcomb and Alley 1983, 33). Newcomb and Alley are especially interested in
producers “who create in the ‘bardic’ center of our shared culture” versus those who
use the medium for individual expression (Newcomb and Alley 1983, 34). Newcomb
terms the latter mode “lyric” television, which is “rooted in subjective response” to the
culture and operates in a voice that is “personal rather than social” (Newcomb and
Alley 1983, 37). He points to Norman Lear’s Mary Hartman, Mary Hartman (1976–
1977) as one example of lyric television and also to Larry Gelbart’s post-M*A*S*H
(1972–1983) portrait of marital discord, United States (1980). Newcomb and Alley are
more interested, conversely, in television’s “choric” function, its capacity to explore
“the central regions of the American mind,” and “the shared systems of meaning and
symbol that form our cultural life” (Newcomb and Alley 1983, 43).
This emphasis on series producers is the third stage of Newcomb and Alley’s authorship argument, although this key component of their theoretical schema does not fully
coalesce until the interviews themselves. An astute rhetorical move in the book is the
way that the coauthors allow their subjects to invoke the term auteur, which they do
often enough. Even more important is the way that the conversations steadily sharpen
the book’s central argument, positing that television is not simply a producer’s medium
but a writer–producer’s medium. All but one of the producers interviewed for the book
learned their craft as writers, became producers on the basis of their achievements as
writers, and regardless of title and screen credit, continued to function as writers. They
either actively scripted their series or, more likely, functioned as supervisors of their
writing staffs, in which capacity they developed story ideas and assigned scripts that
they invariably rewrote or revised before sending into production.
The writer–producer angle becomes steadily more acute in the course of the interviews. The first conversations tend to stress genre, especially melodrama, and working
within the U.S. network “system” to, in the words of one, make “producers out of
writers” (Newcomb and Alley 1983, 58). By the time we arrive at the middle of the
book, its theoretical project is front and center; here, writer–producers Richard
Levinson and William Link define the auteur as “the person who has creative control
and chooses to exercise it” (Newcomb and Alley 1983, 145). A subsequent interview
demonstrates the role of the auteur through the story provided by producer Earl
Hamner, who was a writer and who protected his creative voice through the production
process of the classic American series The Waltons (1971–1981). The final interview
reveals the clearest articulation of Newcomb and Alley’s central theme:
Television is a producer’s medium. Feature movies are a director’s medium, and the
theater is a writer’s medium. There are exceptions, but in general these clichés are true. I
feel the key to a television show is the “writer-producer.” There are other people in
television, called “producers,” but they are not writer-producers. (Garry Marshall in
Newcomb and Alley 1983, 238)
Television Authorship Today
In the postnetwork era, journalists, critics, scholars—and the writer-producers themselves—have continued to push authorial distinctions, in the process elevating some
individuals over others. Indeed, contemporary “star” writer–producers such as Jenji
Kohan, Kurt Sutter, Vince Gilligan, and Shonda Rhimes are among the cadre of top
television “creatives” who have been portrayed as extending and in some crucial ways
expanding the roles played by their classic network era predecessors. The explosive
growth of cable, satellite, and streaming television over the past two decades has been
accompanied by a dramatic rise in the number of such high-profile showrunners, who
are regularly identified in the networks’ promotional materials, featured in fawning
journalistic profiles, and celebrated for their ambitious storytelling strategies as well
as their ability to connect with fans via social media. The ascendance of both “cult”
and “quality” television—along with the expanded means by which creatives can
interact directly with viewers and the networks’ need to differentiate (and elevate)
their product amid a glut of content—clearly has contributed to an increased public
awareness of the showrunner figure and, with it, heightened coverage of them.
Remarkably enough, despite the significant increase in both the number and the
visibility of top series showrunners, there have been relatively few studies focused
specifically on this figure or on television authorship more generally since The
Producer’s Medium was published some three decades ago. The studies of TV showrunners that have been published have focused primarily on a particular strand of
“quality” television such as that produced by Lear’s Tandem and Tinker’s MTM
Enterprises.3 Oddly, as Derek Kompare (2011) notes, television studies would be legitimated as a field not by acknowledging the producer-as-author but rather by denying
or eliding authorship altogether through structural or audience-based analyses. To
speak about the producer figure, it seemed, meant to endorse romantic visions of the
author as creative genius or to focus on elevating the aesthetic dimensions of the
medium in the interest of reinforcing cultural hierarchies (Shattuc 2005). Yet this need
not be the case—and nowhere do we have clearer evidence of this than in Newcomb
and Alley’s own work.
Revisiting The Producer’s Medium in the present moment enables a recentering
and reconceptualization of the writer–producer figure in television studies. In fact,
this figure’s status as an intermediary—engaging variably with network and studio
executives, advertisers, above- and below-the-line personnel, critics, journalists,
viewers, and beyond—makes it a rich site for understanding the larger transformations taking place in television as well as broader struggles over cultural power
across media forms. The book provides a baseline for assessing the convergent era.
While their interviews took place during the height of the classic network era (1975–
1979), the publication itself just preceded the incorporation of the country’s “big
three” broadcast networks into larger media conglomerates. Meanwhile, the diffusion of the VCR, cable, and the remote control was affecting viewing practices and
altering the medium’s status as a “cultural forum” around which mass audiences
aggregated. These changes are prophesied by Newcomb and Alley; they recognize
the tenuous position of certain creative producers as cultural commentators, noting,
“Cable television, with its ability to survive with smaller audiences, segmented and
targeted according to interests, values, and tastes, pulls away from the choric center”
(Newcomb and Alley 1983, 43).
With the possibility of producers functioning in a choric role diminished, we see
these figures often occupying what Newcomb and Alley describe as the “lyrical” position noted above, which involves a producer speaking from a primarily personal place
rather than a social one (Newcomb and Alley 1983, 41). Nowhere is this shift more
apparent than in the work of Lena Dunham and Louis C.K., whose distinctive worldviews have attracted dedicated fans and staunch detractors in equal measure. What’s
more, these television showrunners have done so not only through the medium itself
but across old and new media platforms ranging from Twitter to Time magazine.
Creativity still functions within constraints, but the nature of creativity and the types
of constraints are not the same as they were thirty years ago. Generating podcasts and
DVD commentaries, interacting on social media, attending fan conventions, and
developing webisodes now can all fall under the showrunners’ purview. At the same
time, these figures have been forced to take sides in increasingly contentious labor
battles between the WGA and the AMPTP (Banks 2010). Along with their long-standing
roles as managers and writers, showrunners frequently function simultaneously as
entrepreneurs, salesmen, celebrities, and brand shepherds across media (Mann 2009).
By returning to—and further fleshing out—Newcomb and Alley’s use of terminology
such as lyric and choric and considering the ramifications of deploying such language
in the present context, we have a fresh means of assessing precisely how the showrunner figures culturally, industrially, and aesthetically.
In tandem with these transformations, The Producer’s Medium also underscores
some striking continuities with the past. Television’s cultural status remains in flux.
Recent popular critics’ elevation of aspects of the medium by comparing it to film
(Newman and Levine 2011) mirrors legitimating practices that took place more than
three decades ago. The actual processes involved in creating a prime-time television
series remain surprisingly similar to the production processes outlined in a contemporary how-to manual for aspiring television writers (Gervich 2008). What’s more, white
males continue to disproportionately dominate fictional prime-time producing positions; women and people of color still struggle to find spots on writing staffs, let alone
become showrunners. On the occasions when members of historically marginalized
groups rise to positions of power, they frequently find themselves pigeonholed by
genre and less recognized by critics (Henderson 2011). With respect to this final similarity, The Producer's Medium's concern with outlining the roles and responsibilities of producers in terms of creative control and social commentary intersects with television studies' ongoing engagement with issues of identity and cultural power.
Beyond a handful of scholarly studies about particular “quality” and “cult” writer–
producers, the work on contemporary showrunners in convergent-era television
remains limited in subject and scope.4 As such, we might benefit by revisiting and
revising Newcomb and Alley’s work, using it as a starting point for the study of producers’ authority within specific technological, cultural, industrial, and geographical
contexts. Far from “betraying the roots” of TV studies or feeding into the field’s recent
“aesthetic turn,” explorations of the producer figure can be a political move, providing
a means of foregrounding industrial and creative agency as well as addressing a wide
range of tensions and relationships in the media industries. Showrunners regularly
participate in dialogues about and between industry and audience, management and
labor, television and the Internet, broadcast and cable, the global and the local. We
would benefit from incorporating their voices—along with Newcomb and Alley’s—
more directly into our analyses of the medium and the field.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The authors received no financial support for the research, authorship, and/or publication of this
article.
Notes
1. Approximately one hundred references to “showrunner” or “show runner” appear in
Variety throughout the 1990s; more than five hundred references appear from 2000 to
2011. See Wild (1999) for an early example of an extended discussion of the figure.
2. Increasingly, discourses about the showrunner have emerged in other national contexts. For
example, see Cornea (2009); Redvall (2013).
3. For example, see Feuer et al. (1984), Thompson and Burns (1990), Thompson (1995), and
Kubey (2004).
4. Examples of "cult" and "quality" studies of showrunners include Lavery and Thompson (2002), Pearson (2005), Abbott (2009), and Lavery (2011).
References
Abbott, Stacey. 2009. "How Lost Found Its Audience: The Making of a Cult Blockbuster." In Reading Lost, edited by Roberta Pearson, 9–26. London: I.B. Tauris.
Banks, Miranda J. 2010. "The Picket Line Online: Creative Labor, Digital Activism and the
2007–2008 Writers Guild of America Strike.” Popular Communication 8:20–33.
Cantor, Muriel G. 1971. The Hollywood TV Producer: His Work and His Audience. New York:
Basic Books.
Cornea, Christine. 2009. “Showrunning the Doctor Who Franchise: A Response to Denise
Mann.” In Production Studies: Cultural Studies of Media Industries, edited by Vicki
Mayer, Miranda J. Banks, and John Thornton Caldwell, 115–22. New York: Routledge.
Feuer, Jane, Paul Kerr, and Tise Vahimagi. 1984. MTM: 'Quality Television.' London: BFI
Publishing.
Gerbner, George. 1983. “The Importance of Being Critical—In One’s Own Fashion.” Journal
of Communication 33:355–62.
Gervich, Chad. 2008. Small Screen, Big Picture. New York: Three Rivers Press.
Gitlin, Todd. 1983. Inside Prime Time. New York: Pantheon Books.
Henderson, Felicia D. 2011. “The Culture Behind Closed Doors: Issues of Gender and Race in
the Writers’ Room.” Cinema Journal 50 (2): 145–52.
Kompare, Derek. 2011. “More ‘Moments of Television’: Online Cult Television Authorship.”
In Flow TV: Television in the Age of Media Convergence, edited by Michael Kackman,
Marnie Binfield, Matthew Thomas Payne, Allison Perlman, and Bryan Sebok, 95–113.
New York: Routledge.
Kubey, Robert. 2004. Creating Television: Conversations with the People behind 50 Years of
American TV. Mahwah: Lawrence Erlbaum.
Lavery, David. 2011. “Rob Thomas and Television Creativity.” In Investigating Veronica
Mars: Essays on the Teen Detective Series, edited by Rhonda V. Wilcox and Sue Turnbull,
23–34. Jefferson: McFarland.
Lavery, David, and Robert J. Thompson. 2002. “David Chase, The Sopranos, and Television
Creativity.” In This Thing of Ours: Investigating the Sopranos, edited by David Lavery,
18–25. New York: Columbia University Press.
Mann, Denise. 2009. “It’s Not TV, It’s Brand Management.” In Production Studies: Cultural
Studies of Media Industries, edited by Vicki Mayer, Miranda J. Banks, and John Thornton
Caldwell, 99–114. New York: Routledge.
Newcomb, Horace. 1974. TV: The Most Popular Art. Garden City: Anchor Press.
Newcomb, Horace, ed. 1976. Television: The Critical View. New York: Oxford University Press.
Newcomb, Horace. 1984. “On the Dialogic Aspects of Mass Communication.” Critical Studies
in Mass Communication 1 (1): 34–50.
Newcomb, Horace, and Paul Hirsch. 1983. “Television as a Cultural Forum.” Quarterly Review
of Film Studies 8 (3): 45–56.
Newcomb, Horace, and Robert S. Alley. 1983. The Producer’s Medium: Conversations with
Creators of American TV. Oxford: Oxford University Press.
Newman, Michael Z., and Elana Levine. 2011. Legitimating Television: Media Convergence
and Cultural Status. New York: Routledge.
Pearson, Roberta. 2005. “The Writer/Producer in American Television.” In The Contemporary
Television Series, edited by Michael Hammond and Lucy Mazdon, 11–26. Edinburgh:
Edinburgh University Press.
Redvall, Eva Novrup. 2013. Writing and Producing Television Drama in Denmark: From The
Kingdom to The Killing. New York: Palgrave Macmillan.
Shattuc, Jane. 2005. “Television Production: Who Makes American TV?” In A Companion to
Television, edited by Janet Wasko, 142–53. Malden: Blackwell.
Thompson, Robert J. 1995. Prime Time, Prime Movers: From I Love Lucy to L.A. Law–
America’s Greatest TV Shows and the People Who Created Them. Syracuse, NY: Syracuse
University Press.
Thompson, Robert J., and Gary Burns, eds. 1990. Making Television: Authorship and the
Production Process. New York: Praeger.
Wild, David. 1999. The Showrunners: A Season inside the Billion-Dollar, Death-Defying,
Madcap World of Television’s Real Stars. New York: HarperCollins.
Author Biographies
Alisa Perren is an associate professor in the radio-television-film department at the University
of Texas at Austin. She is author of Indie, Inc.: Miramax and the Transformation of Hollywood
in the 1990s (2012) and coeditor of Media Industries: History, Theory, and Method (2009). She
is also comanaging editor and editorial collective member of the new peer-reviewed journal,
Media Industries.
Thomas Schatz is a professor and director of media studies in the radio-television-film department at the University of Texas at Austin. He is the author of four books on American film,
including The Genius of the System: Hollywood Filmmaking in the Studio Era (1989) and Boom
and Bust: American Cinema in the 1940s (1997). His current book project on conglomerate-era
Hollywood was awarded an Academy Film Scholars grant in 2013.
Special Section: New Beginnings
Beyond the Barrio—Ladies
and Gentlemen, Let’s Get
Ready for . . . Graeme Turner
and Horace Newcomb
Television & New Media
2015, Vol. 16(1) 94–97
© The Author(s) 2014
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1527476414544973
tvnm.sagepub.com
Toby Miller1,2
Abstract
This is a homage to the work of Horace Newcomb and Graeme Turner.
Keywords
Horace Newcomb, Graeme Turner, television, media studies, cultural studies, new
media
It is an honor to be invited to write about two foundational figures of media studies in
Graeme Turner and Horace Newcomb. Their work has invigorated me, and others both
like and unlike me, for decades.
This is so for two reasons. First, the moment I see their names associated with
something, whether it is an interview or a book, I want to read what they have to say.
That is because they are equally scholarly and tendentious. There is always something
new, invigorating, and critical on offer. As Foucault (1985, 8) put it, “There are times
in life when the question of knowing if one can think differently than one thinks, and
perceive differently than one sees, is absolutely necessary if one is to go on looking
and reflecting at all.”
And second, I routinely return to their work, long after encountering it, to reconsider what I thought I had understood. Take their first two monographs. Horace’s TV:
The Most Popular Art (Newcomb 1974) continues to make me ponder seemingly
familiar things anew, as it did when I first read it. Endowed with his distinctive qualities as both a practitioner and an academic, and offering a provocation simply in its
1Murdoch University, Perth, Australia
2Cardiff University, UK
Corresponding Author:
Toby Miller, 5A Dartmouth Park Avenue, London, NW5 1JL, UK.
Email: tobym69@icloud.com
title, this book began TV studies, for me as for many others. For its part, Graeme's
National Fictions: Literature, Film and the Construction of Australian Narrative
(Turner 1993) is a remarkable fusion of new ideas and exegetical expertise. It was the
foundational volume of Australian cultural studies when its first edition appeared in
1986.
They are quite different books but have something powerful in common. Each one
offers original research and argument, transcends the banality of the doctoral thesis or
careerist ploy, and reaches out to general readers with clear, incisive prose. So part of
their achievement has been to keep in mind, as it were, the undergrad reader at a second-tier school, as much as, if not more than, the authors’ fellow academic stars. Yet,
this was never done uncritically, as per the tedium of the average U.S. mass communications textbook.
Horace’s monograph positions television drama alongside literature rather than
radio or film, because its “sense of density” explores complex themes in lengthy treatments with slow build-ups and multisequenced sites of character development and
interaction (Newcomb 1974, 256). He makes this claim in the context of an appeal to
the central question for the humanities-based study of television at that time (and still
today): whether it was worthy of textual analysis as opposed to behavioral interrogation or generic condemnation. Both Horace and Graeme put art along a continuum,
rather than consigning some forms of it to the back catalog of unworthy dross and
others to a transcendent pantheon. They take popular pastimes seriously.
National Fictions is also animated by writing for an audience beyond one’s barrio,
in terms of a student and not just a professorial readership. It acknowledges the nation
as a productive, not necessarily a bad, object. This is in some contrast to much of cultural studies, which easily and frequently constructs and constricts itself with a somewhat unreflexive transnational adoration, despite its dependence on nationally based
educational and publishing systems. Graeme recognized that the seemingly damned
concept of the nation was usefully deployed in cultural policy, diasporic and indigenous work, alternative television, minor cinema, and globalization.
So they write well. And then there is the sheer surprise that their ideas can inspire.
I will give just two of many examples.
Sometimes Horace bristles at vulgar “ists” such as myself, but when it comes to
asymmetries of power, he stands up to be counted. Horace first alerted me to the fact
that the United States was an early-modern exponent of anti-cultural-imperialist, pro-nation-building sentiment. Herman Melville, for instance, opposed the U.S. literary
establishment’s devotion to all things English, questioning the compatibility of a
Eurocentrically cringing import culture with efforts to “carry Republicanism into literature” (Newcomb 1996, 94). These arguments influenced domestic and foreign policies alike. When the first international copyright treaties were being negotiated on the
European continent in the nineteenth century, the United States refused to protect foreign literary works—a belligerent stance that it would denounce today as piratical. But
back then, the country was a net importer of books, seeking to develop a national literary patrimony of its own. Washington was not interested in extending protection to
international works that might hinder its own printers, publishers, or authors.
Graeme avows that media studies is simultaneously and understandably more vocational than many other subjects, due to its commitment to production skills and news-and-current-affairs research; more populist, given its legitimization of the everyday
and success with students; and more politicized, because in some traditions, it has
been influenced by leftists and feminists (Turner 2007). But this is no uncritical welcome. For instance, he queries a recent fad, creative-industries discourse, as “an industry training program” (Turner 2012) that may help perpetuate stratified labor markets
in the production of culture. That kind of synoptic overview is something both men are
capable of providing, in generous yet astringent ways (see, for example, Newcomb
1986, 2000; Turner 2012).
What of the newer media, as opposed to the venerable and middle-aged ones that
made their names? Sometimes, Horace (2009, 117) seems to lament the passing of
time:
“My” television is gone. It began to disappear (disintegrate? Dissolve? Die?) in the early
1980s, but I didn’t notice. I was too busy figuring out what had intrigued me for so long
(and what became a career [job security? identity? burden?])
But he also knows that we are not at the end of the line. Not nearly (Newcomb 2014;
also see Tay and Turner 2010).
Both Horace and Graeme acknowledge that emergent media have historically supplanted their predecessors as sources of authority and pleasure: literature versus oratory, film versus theater, radio versus sheet music. TV blended all of them. A warehouse
of contemporary culture, it merged what had come before, and is now merging with
personal computers (which were modeled on it) to do the same (Newcomb 2005, 110).
Horace recognizes that “the future of television will be essentially the same as its past”
via “strategies of adjustment” (Newcomb 2014).
Jinna Tay and Graeme Turner (2010, 32) have coined the terms “broadcast pessimism” and “digital optimism” to encapsulate two differing positions on the medium’s
future. Proponents of broadcast pessimism argue that we are witnessing the inexorable
obsolescence of traditional TV—the television of family and peer togetherness—
under the impact of media digitization and mobility. Digital optimists, by contrast,
welcome this shiny new epoch, because its texts and technologies give audiences
unconstrained choice and control.
But as Graeme explains in a recent coauthored book, the reality remains that conventional TV is alive and well in most countries around the world, and holds a central,
even dominant cultural position. It “seems designed, no matter what its platform of
delivery, to generate new ways of being-together-while-apart” (Pertierra and Turner
2013, 66). As ever, television represents a space beyond the worlds of work, school,
and family while offering a forum for ideas that can challenge those very institutions
(Newcomb and Hirsch 1983).
No wonder I find these guys tendentious and thorough! As when I read their work
for the first time in the 1980s, revisiting it en bloc for this wee essay confirmed their
shared blend of accessibility and originality. It is a model for us all.
Declaration of Conflicting Interests
The author declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.
Funding
The author received no financial support for the research, authorship, and/or publication of this
article.
References
Foucault, Michel. 1985. The Uses of Pleasure: The History of Sexuality, Volume 2. Translated
by Robert Hurley. Harmondsworth: Penguin.
Newcomb, Horace. 1974. TV: The Most Popular Art. Garden City: Anchor Press/Doubleday.
Newcomb, Horace. 1986. “American Television Criticism, 1970-1985.” Critical Studies in
Mass Communication 3 (2): 217–28.
Newcomb, Horace. 1996. “Other People’s Fictions: Cultural Appropriation, Cultural Integrity,
and International Media Strategies.” In Mass Media and Free Trade: NAFTA and the
Cultural Industries, edited by Emile G. McAnany and Kenton T. Wilkinson, 92–109.
Austin: University of Texas Press.
Newcomb, Horace. 2000. “Searching for Landmarks at the Forefront of Media Research.”
Nordicom Review 2:15–21.
Newcomb, Horace. 2005. “Studying Television: Same Questions, Different Contexts.” Cinema
Journal 45 (1): 107–11.
Newcomb, Horace. 2009. "My Media Studies = My TV." Television & New Media 10 (1):
117–18.
Newcomb, Horace. 2014. "The More Things Change . . ." Flow 19 (5). http://flowtv.org/2014/01/
the-more-things-change/.
Newcomb, Horace, and Paul M. Hirsch. 1983. “Television as a Cultural Forum: Implications for
Research.” Quarterly Review of Film Studies 8 (3): 45–55.
Pertierra, Anna Cristina, and Graeme Turner. 2013. Locating Television: Zones of Consumption.
London: Routledge.
Tay, Jinna, and Graeme Turner. 2010. "Not the Apocalypse: Television Futures in the Digital
Age.” International Journal of Digital Television 1 (1): 31–50.
Turner, Graeme. 1993. National Fictions: Literature, Film and the Construction of Australian
Narrative. 2nd ed. Sydney: Allen & Unwin.
Turner, Graeme. 2007. “Another Way of Looking at It.” Australian, May 30. http://
www.theaustralian.com.au/higher-education/another-way-of-looking-at-it/storye6frgcjx-1111113636658.
Turner, Graeme. 2012. What’s Become of Cultural Studies? London: Sage.
Author Biography
Toby Miller is the Sir Walter Murdoch Professor of Cultural Policy Studies at Murdoch
University and professor of journalism, media, and cultural studies at Cardiff University/
Prifysgol Caerdydd. His adventures can be scrutinized at www.tobymiller.org.