FORTHCOMING BOOKS
January 2013
ART, ARCHITECTURE, WRITINGS BY ARTISTS, CULTURAL STUDIES
A TOPOLOGY OF EVERYDAY CONSTELLATIONS by Georges Teyssot
Today, spaces no longer represent a bourgeois haven; nor are they
the sites of a classical harmony between work and leisure, private
and public, the local and the global. The house is not merely a home
but a site for negotiations with multiple spheres—the technological
as well as the physical and the psychological. In A Topology of
Everyday Constellations, Georges Teyssot considers the intrusion of
the public sphere into private space, and the blurring of notions of
interior, privacy, and intimacy in our societies. In doing so, he suggests
a rethinking of design in terms of a new definition of the practices
of everyday life. Teyssot considers the door, the window, the mirror,
and the screen as thresholds—interstitial spaces that divide the
world in two: the outside and the inside. Thresholds, he suggests, work both as markers
of boundaries and as bridges to the exterior. The stark choice between boundary and
bridge creates a middle space, an in-between that holds the possibility of exchanges and
encounters. If the threshold no longer separates public from private, and if we can no
longer think of the house as a bastion of privacy, Teyssot asks, does the body still inhabit
the house—or does the house, evolving into a series of microdevices, inhabit the body?
In addition to a general audience interested in architecture, architectural historians
and theorists and students in these disciplines will welcome this addition to the Writing
Architecture series.
Hardcover. 312 pages; 89 b&w illustrations. February 2013.
NO MEDIUM by Craig Dworkin
What do blank pages, erased texts, white canvases, unprinted pages, silent music,
immobile performers, and clear film have to do with the history of art and literature?
A lot. No Medium tells a story not just about the development of modern art and
literature through such events, but also about the meaning of these apparently
content-less events. In this book, poet and critic Craig Dworkin considers a number of
works—including a fictional collection of poetry in Jean Cocteau’s Orphée, Nick Thurston’s
erased copy of Maurice Blanchot’s The Space of Literature that left only Thurston’s reset
marginalia, Robert Rauschenberg’s White Paintings and John Cage’s Music for Piano—that
argue there is no “medium”, and that no single medium can be apprehended in isolation.
Media (always necessarily multiple) only become legible in social contexts because they
are not things, but rather activities: commercial, communicative, and interpretive. We
are misled when we think of “media” as objects—as blank, base things—rather than social
events. Each chapter of No Medium focuses on ostensibly “blank” works. Taken together,
they argue for a new understanding of media themselves. There is no medium—there
are only ever media: interpretive activities taking place in socially inscribed places.
Taking as its subject some of the most radical and seemingly illegible experiments in
twentieth-century artistic practice, this book reads them with eloquence and invests
them with meaning. Scholars and students in art history, media studies, literary theory,
and music, as well as readers interested in the avant-garde across the arts, make up the
core audience for this accessible book.
Hardcover. 224 pages. February 2013.
THE FOURTH DIMENSION AND NON-EUCLIDEAN GEOMETRY IN MODERN
ART, revised edition, by Linda Dalrymple Henderson
First published in 1983 by Princeton University Press, this welcome
reprint will include a new introduction from the author outlining
the further research she has conducted in this area since the
book’s first publication. The Fourth Dimension and Non-Euclidean
Geometry in Modern Art is an examination of the Zeitgeist of the 20th
century, when art and science could and did influence one another.
Artists such as Marcel Duchamp, László Moholy-Nagy, and El Lissitzky
explored what the fourth dimension might be, from studies of work
on Albert Einstein’s theories of time and relativity to more visual
explorations of planes beyond the conventionally conceived three dimensions that we
live in. A quote from Tony Robbin, cited in the first edition of the book: “Artists who
are interested in four-dimensional space are not motivated by desire to illustrate new
physical theories, nor by a desire to solve mathematical problems. We are motivated
by a desire to complete our subjective experience by inventing new aesthetic and
conceptual capabilities. We are not in the least surprised, however, to find physicists and
mathematicians working simultaneously on a metaphor for space in which paradoxical
three dimensional experiences are resolved only by a four dimensional space. Our reading
of the history of culture has shown us that in the development of new metaphors for space
artists, physicists and mathematicians are usually in step.” This interest is still current,
with new materials and computational technologies helping to carve new terrains. Readers
on contemporary art and new media will welcome this new edition, in the Leonardo
series.
Hardcover. 740 pages; 140 b&w illustrations. February 2013.
HÉLIO OITICICA AND NEVILLE D’ALMEIDA: Block-Experiments in
Cosmococa by Sabeth Buchmann and Max Jorge Hinderer Cruz
Hélio Oiticica (1937–1980) occupies a central position in the Latin
American avant-garde of the postwar era. Associated with the Rio de
Janeiro-based neo-concretist movement at the beginning of his career,
Oiticica moved from object production to the creation of chromatically
opulent and sensually engulfing large-scale installations or wearable
garments. Building on the idea for a film by Brazilian underground
filmmaker Neville D’Almeida, Oiticica developed the concept for
Block-Experiments in Cosmococa—Program in Progress (1973–1974) as
an “open program”: a series of nine proposals for environments, each
consisting of slide projections, soundtracks, leisure facilities, drawings (with cocaine
used as pigment), and instructions for visitors. It is the epitome of what the artist
called his “quasi-cinema” work—his most controversial production, and perhaps his most
direct effort to merge art and life. Presented publicly for the first time in 1992, these
works have been included in major international exhibitions in Los Angeles, Chicago,
London, and New York. Drawing on unpublished primary sources, letters, and writings
by Oiticica himself, the authors discuss Oiticica’s work in relation to the diaspora of
Brazilian intellectuals during the military dictatorship, the politics of media circulation,
the commercialization of New York’s queer underground, the explicit use of cocaine as
means of production, and possible future reappraisals of Oiticica’s work. Readers on
contemporary art generally, and those specifically interested in the Latin American
avant-garde, make up the core audience for this contribution to the Afterall One Work series.
Hardcover. 130 pages; 32 b&w illustrations. March 2013.
WHAT WAS CONTEMPORARY ART? by Richard Meyer
Contemporary art in the early twenty-first century is often discussed
as though it were a radically new phenomenon unmoored from
history. Yet all works of art were once contemporary to the artist and
culture that produced them. In What Was Contemporary Art? Richard
Meyer reclaims the contemporary from historical amnesia, exploring
episodes in the study, exhibition, and reception of early twentieth-century art and
visual culture. Meyer analyzes an undergraduate course taught by Alfred Barr at
Wellesley College in 1927 as a
key moment in the introduction of works by living artists into the
discipline of art history, then turns to a series of exhibitions from the 1930s that
put contemporary art in dialogue with pre-modern works ranging from prehistoric
cave pictures to Italian Renaissance paintings. Meyer also treats the controversy that
arose in 1948 over the decision by Boston’s Institute of Modern Art to change its name
to the Institute of Contemporary Art. By retrieving moments in the history of once-current
art, Meyer redefines “the contemporary” as a condition of being alive to and
alongside other moments, artists, and objects. A generous selection of images, many
in color—from works of fine art to museum brochures and magazine covers—support and
extend Meyer’s narrative. These works were contemporary to their own moment. Now,
in Meyer’s account, they become contemporary to ours as well. Scholars and students of
contemporary art and general art readers will welcome this book.
Hardcover. 256 pages; 36 color illustrations; 81 b&w illustrations. March 2013.
ADHOCISM: The Case for Improvisation, updated and expanded
edition, by Charles Jencks and Nathan Silver
When this book first appeared in 1972, it was part of the Zeitgeist that
would define a new architecture and design era—a new way of thinking
that challenged the purist doctrines and formal models of modernism.
Adhocism has always been around. (Think of Robinson Crusoe, making
a raft and then a shelter from the wreck of his ship.) As a design
principle, adhocism starts with everyday improvisations: a bottle as a
candleholder, a dictionary as a doorstop, a tractor seat on wheels as
a dining room chair. But it also changes the way we approach almost
every activity, from play to architecture to city planning to political
revolution. Charles Jencks and Nathan Silver’s book was a manifesto
for a generation that took pleasure in doing things ad hoc, using materials at hand to
solve real-world problems. The implications were subversive. Turned-off citizens of the
1970s immediately adopted the book as a DIY guide. The word “adhocism” entered the
vocabulary, the concept of adhocism became part of the designer’s toolkit, and Adhocism
became a cult classic. Now Adhocism is available again, with new texts by Jencks and
Silver reflecting on the past forty years of adhocism and new illustrations demonstrating
adhocism’s continuing relevance. Engagingly written, filled with pictures and examples
from areas as diverse as auto mechanics and biology, Adhocism urges us to pay less
attention to the rulebook and more to the way we actually do things. It declares that
problems are not necessarily solved in a genius’s “eureka!” moment but by trial and error,
adjustment and readjustment. In addition to an audience of designers and architects,
general readers will welcome this expanded edition of Adhocism.
Paperback. 256 pages; 244 b&w illustrations. April 2013.
CRITICAL LABORATORY: The Writings of Thomas Hirschhorn. Edited by Lisa Lee and Hal Foster.
For the artist Thomas Hirschhorn, writing is a crucial tool at every stage of his artistic
practice. From the first sketch of an idea to appeals to potential collaborators, from
detailed documentation of projects to post-disassembly analysis, Hirschhorn’s writings
mark the trajectories of his work. This volume collects Hirschhorn’s widely scattered
texts, presenting many in English for the first time. In these writings, Hirschhorn discusses
the full range of his art, from works on paper to the massive Presence and Production
projects in public spaces. “Statements and Letters” address broad themes of aesthetic
philosophy, politics, and art historical commitments. “Projects” consider specific artworks
or exhibitions. “Interviews” capture the artist in dialogue with Benjamin Buchloh,
Jacques Rancière, and others. Throughout, certain continuities emerge: Hirschhorn’s
commitment to quotidian materials; the centrality of political and economic thinking
in his work; and his commitment to art in the public sphere. Taken together, the texts
serve to trace the artist’s ideas and artistic strategies over the past two decades. Critical
Laboratory also reproduces, in color, 33 Ausstellungen im öffentlichen Raum 1989–1998,
an out-of-print catalog of Hirschhorn’s earliest works in public space. Readers on
contemporary art will welcome Critical Laboratory.
Hardcover. 354 pages; 48 color illustrations; 63 b&w illustrations. July 2013.
CONSTRUCTING AN AVANT-GARDE: Art in Brazil, 1949-1979 by Sérgio B. Martins
Brazilian avant-garde artists of the postwar era worked from a fundamental but
productive out-of-jointness, modernist but distant from modernism. Europeans and North
Americans may feel a similar displacement when viewing Brazilian avant-garde art; the
unexpected familiarity of the works serves to make them unfamiliar. In Constructing an
Avant-Garde, Sérgio Martins seizes on this uncanny obliqueness and uses it as the basis
for a reconfigured account of the history of Brazil’s avant-garde. His discussion covers
not only widely renowned artists and groups—including Hélio Oiticica, Lygia Clark, and
neoconcretism—but also important artists and critics who are less well known outside
Brazil, including Mário Pedrosa, Ferreira Gullar, Luis Sacilotto, and Rubens Gerchman.
Martins argues that artists of Brazil’s postwar avant-garde updated modernism in a way
that was radically at odds with European and North American art historical narratives.
He describes defining episodes in Brazil’s postwar avant-garde, discussing crucial critical
texts, including Gullar’s “Theory of the Non-Object,” a phenomenological account
of neoconcrete artworks; Oiticica, constructivity, and Mondrian; portraiture,
self-portraiture, and identity; the nonvisual turn and missed encounters with conceptualism;
and monochrome, manifestos, and engagement. The Brazilian avant-garde’s hijacking
of modernism, Martins shows, gained further complexity as artists began to face their
international minimalist and conceptualist contemporaries in the 1960s and 1970s.
Readers on art, particularly postwar art, as well as art historians and students, make up
the core audience for Constructing an Avant-Garde.
Hardcover. 248 pages; 15 color illustrations; 60 b&w illustrations. MIT Press holds all
rights with the exception of Portuguese language rights. September 2013.
RODNEY GRAHAM: The Phonokinetoscope by Shep Steiner
Emerging from Vancouver’s 1970s photoconceptual tradition, Rodney Graham makes work
that is often informed by historical literary, musical, philosophical, and popular references.
The Phonokinetoscope (2001) is a five-minute 16mm film loop of the artist riding his
Fischer Original bicycle through Berlin’s Tiergarten while taking LSD, accompanied
by a fifteen-minute song recorded on a vinyl LP. The turntable drives the projection of the
film so that the placement of the needle on the record, and its displacement, act as an
on-off switch for the looped projection. As a result, the moving image and sound share
an asynchronous relationship unhinging the work from a simple correlation to cinema
or music videos. During Graham’s pastoral cycle ride through the Tiergarten we are
presented with a series of observations and references: Albert Hofmann’s unintentional
acid trip while riding a bicycle home from his laboratory one evening; Paul Newman riding
backwards on a bicycle in Butch Cassidy and the Sundance Kid. This tangle of citations,
historical sources and images and the musical thicket of riffs and lyric borrowings
become increasingly complex as the ongoing loop repeats hypnotically before the viewer,
displaying a world rich with subtle meaning and derived as much from pop culture as
from the eighteenth and nineteenth centuries. The Phonokinetoscope represents a pivotal moment
in the artist’s career and is evidence of Graham’s engagement both with the origins of
cinema and its eventual demise. Organized in six chapters, this book offers a snapshot
view of the artist’s transition into a new medium and positions Graham’s practice in
dialogue with the work of Bas Jan Ader, Paul McCarthy, Jack Goldstein, Ian Wallace, and
Jeff Wall. Steiner offers grounds for drawing comparison to Richard Wagner’s notion
of the Gesamtkunstwerk, a notion that strongly relates to Graham’s earliest musical
interventions such as Recital (1995) that hinged upon Wagner. Readers on contemporary
art, students, and followers of Afterall’s One Work series will welcome this title.
Paperback. 120 pages; 32 color illustrations. September 2013.
SNAPSHOT PHOTOGRAPHY: The Lives of Images by Catherine Zuromskis
Snapshots capture everyday occasions. Taken by amateur photographers with simple
point-and-shoot cameras, snapshots often commemorate something that is private
and personal; yet they also reflect widely held cultural conventions. The poses may
be formulaic, but a photograph of loved ones can evoke a deep affective response.
Scholars of art and culture tend to discount snapshot photography; it is too ubiquitous,
too unremarkable, too personal. Zuromskis argues for its significance. Snapshot
photographers, she contends, are not so much creating spontaneous records of their lives
as they are participating in a prescriptive cultural ritual. A snapshot is not only a record
of interpersonal intimacy but also a means of linking private symbols of domestic harmony
to public ideas of social conformity. Through a series of case studies, Zuromskis explores
the social life of snapshot photography in the United States in the latter half of the
twentieth century. She examines the treatment of snapshot photography in the 2002 film
One Hour Photo and in the television crime drama Law and Order: Special Victims Unit;
the growing interest of collectors and museum curators in “vintage” snapshots; and the
“snapshot aesthetic” of Andy Warhol and Nan Goldin. She finds that Warhol’s photographs
of the Factory community and Goldin’s intense and intimate photographs of friends and
family use the conventions of the snapshot to celebrate an alternate version of “family
values.” In today’s digital age, snapshot photography has become even more ubiquitous
and ephemeral—and, significantly, more public. But buried within snapshot photography’s
mythic construction, Zuromskis argues, is a site of democratic possibility. General
readers interested in photography as well as scholars of art history and visual studies,
cultural studies, American studies, sociology, anthropology, film and media are among the
audience for this book.
Hardcover. 264 pages; 77 b&w illustrations. September 2013.
WHY PHOTOGRAPHY MATTERS by Jerry L. Thompson
Photography matters, writes Jerry Thompson, because of how it works—not only as an
artistic medium but also as a way of knowing. It matters because how we understand
what photography is and how it works tells us something about how we understand
anything. With these provocative observations, Thompson begins a wide-ranging and lucid
meditation on why photography is unique among the picture-making arts. Thompson, a
working photographer for forty years, constructs an argument that moves with natural
logic from Thomas Pynchon (and why we read him for his vision and not his command of
miscellaneous facts) to Jonathan Swift to Plato to Emily Dickinson (who wrote “Tell all
the Truth but tell it slant”) to detailed readings of photographs by Eugène Atget, Garry
Winogrand, Marcia Due, Walker Evans, and Robert Frank. He questions Susan Sontag’s
assertion in On Photography that “nobody” can any longer imagine literate, authoritative,
or transcendent photographs. He considers the money-fueled expansion of the market
for photography, and he compares ambitious “meant-for-the-wall” photographs with
smaller, quieter works. Forcefully and persuasively, Thompson argues for photography
as a medium concerned with understanding the world we live in—a medium whose
business is not constructing fantasies pleasing to the eye or imagination but describing
the world in the toughest and deepest way. Students and practitioners of photography,
and general readers interested in the arts and culture make up the core audience for Why
Photography Matters.
Hardcover. 112 pages; 7 b&w illustrations. September 2013.
ARCHITECT? A Candid Guide to the Profession, third edition, by Roger K. Lewis
Since 1985, Architect? has been an essential text for aspiring architects, offering the best
basic guide to the profession available. This third edition has been substantially revised
and rewritten, with new material covering the latest developments in architectural
and construction technologies, digital methodologies, new areas of focus in teaching
and practice, evolving aesthetic philosophies, sustainability and green architecture,
and alternatives to traditional practice. Chapter 1 asks “Why Be an Architect?” and
chapter 2 offers reasons “Why Not to Be an Architect.” After this provocative beginning,
Architect? goes on to explain and critique architectural education, covering admission,
degree and curriculum types, and workload as well as such post-degree options as
internship, teaching, and work in related fields. It offers a detailed discussion of
professors and practitioners and the “-isms” and “-ologies” most prevalent in teaching
and practicing architecture. It explains how an architect works and gets work, and
describes architectural services from initial client contact to construction oversight.
The new edition also includes a generous selection of drawings and cartoons from the
author’s Washington Post column, “Shaping the City,” offering teachable moments wittily
in graphic form. In Architect? Lewis explains—for students, professors, practitioners, and
even prospective clients—how architects think and work and what they care about as
they strive to make the built environment more commodious, more beautiful, and more
sustainable.
Paperback. 248 pages; 94 b&w illustrations. October 2013.
YOUR EVERYDAY ART WORLD by Lane Relyea
Over the past twenty years, the network has come to dominate the art world, affecting
not just interaction among art professionals but the very makeup of the art object
itself. The hierarchical and restrictive structure of the museum has been replaced by
temporary projects scattered across the globe, staffed by free agents hired on short-term
contracts, viewed by spectators defined by their predisposition to participate and make
connections. In this book, Lane Relyea tries to make sense of these changes, describing a
general organizational shift in the art world that affects not only material infrastructures
but also conceptual categories and the construction of meaning. Examining art
practice, exhibition strategies, art criticism, and graduate education, Relyea links the
transformation of the art world to globalization and the neoliberal economy. He connects
the new networked, participatory art world—hailed by some as inherently democratic—
with the pressures of part-time temp work in a service economy, the calculated
stockpiling of business contacts, and the anxious duty of being a “team player” at work.
Relyea calls attention to certain networked forms of art—including relational aesthetics,
multiple or fictive artist identities, and bricolaged objects—that can be seen to oppose
the values of neoliberalism rather than to romanticize and idealize them. Relyea offers a
powerful answer to the claim that the interlocking functions of the network—each act of
communicating, of connecting, of practicing—are without political content. The audience
for Your Everyday Art World will include artists and students of art history, as well as
readers on contemporary culture and art.
Hardcover. 212 pages; 36 b&w illustrations. October 2013.
SCIENCE, TECHNOLOGY, AND SOCIETY, HISTORY OF SCIENCE AND COMPUTING,
INFORMATION SCIENCE, INNOVATION
TECHNOLOGIES OF CHOICE? ICTs, Development, and the Capabilities
Approach by Dorothea Kleine
Information and communication technologies (ICTs)—especially the
Internet and the mobile phone—have changed the lives of people all
over the world. These changes affect not just the affluent populations
of income-rich countries but also disadvantaged people in both global
North and South, who may use free Internet access in telecenters and
public libraries, chat in cybercafes with distant family members, and
receive information by text message or email on their mobile phones.
Drawing on Amartya Sen’s capabilities approach to development—which
shifts the focus from economic growth to a more holistic, freedom-based
idea of human development—Dorothea Kleine in Technologies
of Choice? examines the relationship between ICTs, choice, and development. She
proposes a conceptual framework, the Choice Framework, that can be used to analyze
the role of technologies in development processes. She applies the Choice Framework
to a case study of micro-entrepreneurs in a rural community in Chile. Kleine combines
ethnographic research at the local level with interviews with national policymakers, to
contrast the high ambitions of Chile’s pioneering ICT policies with the country’s complex
social and economic realities. She examines three key policies of Chile’s groundbreaking
Agenda Digital: public access, digital literacy, and an online procurement system. The
policy lesson we can learn from Chile’s experience, Kleine concludes, is the necessity of
measuring ICT policies against a people-centered understanding of development that has
individual and collective choice at its heart. Technologies of Choice? will be of particular
interest to readers on Information Studies, science, technology, and society, and Latin
American studies. In the Information Society Series.
Hardcover. 264 pages; 34 b&w illustrations. February 2013.
THE VIEW FROM ABOVE: The Science of Social Space by Jeanne Haffner
In mid-twentieth century France, the term “social space” (l’espace social)—the idea
that spatial form and social life are inextricably linked—emerged in a variety of social
science disciplines. Taken up in the 1960s by the French New Left, it also came to
inform the practice of urban planning. In The View from Above, historian Jeanne
Haffner traces the evolution of the science of social space from the interwar period to
the 1970s, illuminating in particular the role of aerial photography in this new way of
conceptualizing socio-spatial relations. As early as the 1930s, the view from above served
as a critical means to connect the social and the spatial for anthropologists such as Marcel
Griaule. Just a few decades later, the Marxist urban sociologist Henri Lefebvre called the
perspective enabled through aerial photography—a technique closely associated with
the French colonial state and military—“the space of state control.” Lefebvre and others
nevertheless used the notion of “social space” to recast the problem of massive modernist
housing projects (grands ensembles) to encompass the modern suburb (banlieue) itself—a
critique that has contemporary resonance in light of the banlieue riots of 2005 and 2007.
Haffner shows how such “views” permitted new ways of conceptualizing the old problem
of housing to emerge. She also points to broader issues, including the influence of the
colonies on the metropole, the application of sociological expertise to the study of the
built environment, and the development of a spatially oriented critique of capitalism.
The View from Above will be of particular interest to students and practitioners of urban
studies and planning, visual studies, STS, and history of public policy.
Hardcover. 256 pages; 26 b&w photos. March 2013.
ARGUMENTS THAT COUNT: Physics, Computing, and Missile Defense, 1949-2012 by Rebecca
Slayton
More than fifty years ago, real-time computers developed to help defend the United
States against a missile attack were regarded as the ultimate in speed and reliability,
human-free and infallible. Elite presidential scientific advisers, most of them physicists,
made a politically persuasive case for computerized air defense while neglecting the
risk of catastrophic computer failure. Non-elite technologists, meanwhile, charged with
managing the most complex computer programs ever developed, began to recognize the
risks inherent in these systems. In Arguments That Count, Rebecca Slayton compares the
missile defense analyses of the dominant experts—physicists—with those of computer
scientists. Doing so, she shows how different kinds of experts assess the promise and risk
of technology and illuminates the messy practice of constructing persuasive arguments
about technology. Slayton shows that the risk of complex software failure, brushed aside
in early debates over missile defense, began to be addressed as computer scientists
developed what she calls a disciplinary repertoire—a set of quantitative rules and codified
knowledge—that enabled them to construct more authoritative arguments about missile
defense. In the 1980s, with President Ronald Reagan’s proposal for the Strategic Defense
Initiative (SDI, also known as “Star Wars”), the feasibility of computerized missile defense
systems became the subject of nationwide debate. Computer scientists had learned
how to make their arguments count in the political process. The primary audience for
this book is made up of scholars and students of Science, Technology, and Society and of
security studies.
Hardcover. 272 pages. August 2013.
GIRLS COMING TO TECH! A History of American Engineering Education for Women by Amy Sue
Bix
Engineering education in the United States was long regarded as exclusively male
territory. Women who studied or worked in engineering were popularly perceived
as oddities, outcasts, unfeminine (or inappropriately feminine in a male world). In
Girls Coming to Tech!, Amy Bix tells the story of how women gained entrance to the
traditionally male field of engineering in American higher education. Bix explains that
very few women breached the gender-reinforced boundaries of engineering education
before World War II. During the war, however, women were actively recruited, trained
as engineering aides, and channeled directly into defense work. These wartime training
programs, although designed to be temporary, demonstrated that women could handle
technical subjects, and a few engineering programs opened their doors to women. The
author offers three detailed case studies of postwar engineering coeducation: at Georgia
Tech, where women were admitted in 1952 to avoid a court case; at Caltech in 1968,
where male students pushed for coeducation, arguing for women’s civilizing influence;
and at MIT, where women had been admitted since the 1870s but where women’s
education was considered a minor afterthought. In the 1950s, women made up less than
one percent of students in American engineering programs; in 2010 and 2011, women
earned 18.4% of bachelor’s degrees, 22.6% of master’s degrees, and 21.8% of doctorates in
engineering. Students and scholars in the History of Technology and Science, students of
the history of American education and engineering, and lay readers on women’s studies
will be especially interested in this book.
Hardcover. 328 pages. September 2013.
MAGNETIC MOMENTS: The Life of Paul Lauterbur, Inventor of MRI by M. Joan Dawson
On September 2, 1971, the chemist Paul Lauterbur had an idea that would change the
practice of medical research. Considering recent research findings about the use of
nuclear magnetic resonance (NMR) signals to detect tumors in tissue samples, Lauterbur
realized that the information from NMR signals could be recovered in the form of images—
and thus obtained noninvasively from a living subject. It was an unexpected epiphany:
he was eating a hamburger at the time. Lauterbur rushed out to buy a notebook in which
to work out his idea; he completed his notes a few days later. He had discovered the
basic method used in all MRI scanners around the world, and for this discovery he would
share the Nobel Prize for Physiology or Medicine in 2003. This book, by Lauterbur’s wife
and scientific partner, M. Joan Dawson, is the story of Paul Lauterbur’s discovery and the
subsequent development of the most important medical diagnostic tool since the X-ray.
With MRI, Lauterbur had discovered an entirely new principle of imaging. Dawson explains
the science behind the discovery and describes Lauterbur’s development of the idea, his
steadfastness in the face of widespread skepticism and criticism, and related work by
other scientists, including Peter Mansfield (Lauterbur’s Nobel co-recipient) and Raymond
Damadian (who famously feuded with Lauterbur over credit for the ideas behind MRI). She
offers not only the story of one man’s passion for his work but also a case study of how
science is actually done: a flash of insight followed by years of painstaking work. General
science readers as well as readers on the history of science, technology, and medicine
make up the core audience for this biography.
Hardcover. 256 pages; 42 b&w illustrations. September 2013.
THE NEWS GAP: When the Supply and Demand of Information Do Not Meet by Pablo J.
Boczkowski and Eugenia Mitchelstein
The stories that journalists who work at mainstream online news sites consider the most
important and the stories that attract the most attention among the public on these sites
are different. This book examines the gap between them, assessing its magnitude and
the factors that shape it, and reflecting on what this means for the economic
viability of these news organizations and the quality of democratic life in the digital
age. The News Gap is unique in terms of subject matter and methodology—the empirical
studies are based on an innovative research design that relies on primary data to examine
the concurrent news choices of journalists and consumers. It draws on the analysis of
almost 40,000 stories collected by the authors’ research team and examines twenty
news sites based in seven different countries in North and South America and Western
Europe. Its findings are surprising: for instance, ideology and geography do not affect
the existence and size of the gap, and much touted news formats such as blogs and
user-generated content have very limited appeal among the public. The authors offer a
thorough discussion of what these findings mean for the future of media and democracy.
Scholars and students in media studies, journalism, communications, and public affairs
are among the core readers for The News Gap.
Hardcover. 256 pages. September 2013. MIT Press holds all rights with the exception of
Spanish language rights.
EMIL DU BOIS-REYMOND: Neuroscience, Self, and Society in Nineteenth-Century Germany by
Gabriel Finkelstein
Emil du Bois-Reymond is the most important forgotten intellectual of the nineteenth
century. Du Bois-Reymond (1818–1896) was famous in his native Germany and beyond
for his research in neuroscience and his influential and provocative addresses on
science and culture. Gabriel Finkelstein draws on du Bois-Reymond’s personal papers,
published writings, and the responses of contemporaries. Du Bois-Reymond’s discovery of the electrical
transmission of nerve signals, his linking of structure to function in neural tissue, and
his theory that neural connections improved with use all helped lay the foundations for
modern neuroscience. Du Bois-Reymond’s public lectures, covering topics in science,
philosophy, history, and literature, made him a celebrity. In these widely ranging talks,
du Bois-Reymond introduced Darwin to German students (triggering two days of debate
in the Prussian parliament) and proclaimed the mystery of consciousness, heralding the
age of doubt. With this book, the first modern biography of du Bois-Reymond, Finkelstein
recovers an important chapter in the history of science. Scholars and students of the
history of science, the history of neuroscience, and German and European intellectual
history will be especially interested in this volume.
Hardcover. 360 pages; 15 b&w illustrations. October 2013.
MONITORING MOVEMENTS IN DEVELOPMENT AID: Recursive Partnerships and Infrastructure by
Casper Bruun Jensen and Brit Ross Winthereik
In Monitoring Movements in Development Aid, Casper Jensen and Brit Winthereik
consider the processes, social practices, and infrastructures that are emerging to monitor
development aid, discussing both empirical phenomena and their methodological and
analytical challenges. Jensen and Winthereik focus on efforts by aid organizations
to make better use of information technology; they analyze a range of development
aid information infrastructures created to increase accountability and effectiveness.
They find that constructing these infrastructures is not simply a matter of designing
and implementing technology but entails forging new platforms for action that are
simultaneously imaginative and practical, conceptual and technical. After presenting an
analytical platform that draws on science and technology studies and the anthropology
of development, Jensen and Winthereik present an ethnographically based analysis of
the mutually defining relationship between aid partnerships and infrastructures. Among
the topics addressed are the crucial role of users (both actual and envisioned) in aid
information infrastructures; efforts to make aid information dynamic and accessible;
existing monitoring activities of an environmental NGO; and national-level performance
audits, which encompass concerns of both external control and organizational learning.
Jensen and Winthereik argue that central to the emerging movement to monitor
development aid is the blurring of means and ends: aid information infrastructures are
both technological platforms for knowledge about aid and forms of aid and empowerment
in their own right. Scholars and students of Science, Technology, and Society and readers
on information science and development studies in particular will welcome this addition
to the series.
Hardcover. 224 pages. October 2013.
NETWORKING PERIPHERIES: Digital Universalism and Technological Futures Beyond Centers by
Anita Say Chan
In Networking Peripheries, Anita Chan shows that digital cultures exist beyond Silicon
Valley and other famous centers of technological innovation and entrepreneurship. The
developing digital cultures in the Global South demonstrate vividly that there are more
ways than one to imagine what digital practice and connection could look like. To explore
these alternative visions of technological futures, Chan investigates Peru’s diverse digital
engagements, from attempts to protect the intellectual property of indigenous artisans
to open technology activism to digital education initiatives. Peru’s recent economic
growth has helped to expand an active consumer class in Lima and other cities and to
fuel the proliferation of electronic devices nationwide. In rural areas, cell phones are
more common than landlines; networked life is experienced not only by urban elites.
Drawing on accounts from government planners, regional information activists, traditional
artisans, rural educators, and others, Chan describes a series of Peruvians’ interactions
with digital technologies, including government efforts to turn rural artisans into a new
creative class; proposals for state-wide adoption of open source-based technologies;
the translation of software into indigenous languages; and the One Laptop Per Child program’s
distribution of simple laptop computers to rural schoolchildren. As these cases show,
the digital cultures and network politics that are emerging on the periphery do not
necessarily replicate the universalized technological future imagined in the center.
Students of Science, Technology, and Society and readers on innovation and development
studies, communication, and Latin American studies make up the core audience for
Networking Peripheries.
Hardcover. 248 pages. October 2013.
NEW MEDIA STUDIES, NEW MEDIA ART, HUMAN-COMPUTER INTERACTION, DIGITAL
HUMANITIES, INFORMATION DESIGN, ENGINEERING SYSTEMS
REGULATING CODE: Good Governance and Better Regulation in the Information Age by Ian
Brown and Christopher T. Marsden
Governments and their regulatory agencies have struggled to keep up with the rapidly
changing technologies and uses of the Internet. In Regulating Code, a regulatory lawyer
and a computer scientist combine their perspectives to analyze the regulatory shaping
of “code”—the technological environment of the Internet—to achieve more economically
efficient and socially just regulation. They examine five “hard cases” that illustrate the
state exercise of regulatory power (or its forbearance from exercising that power) in this
new domain: privacy and data protection, copyright and creativity incentives, censorship,
social networks and user-generated content, and net neutrality. The authors describe
a multistakeholder environment for Internet governance, in which user groups have a
place alongside business and government, but caution that the regulatory goals of users,
large information companies, and governments (whether elected or authoritarian) are
not necessarily congruent. They emphasize that the Internet’s interoperability is both
an innovative strength and an inherent security weakness, and draw lessons from the
regulatory and interoperability failures illustrated by the five cases. Finally, they propose
what they term a “prosumer law” approach designed to enhance the production of public
goods, including the protection of fundamental democratic rights. Readers in information
policy, law, and Internet studies will be especially interested in Regulating Code.
Hardcover. 288 pages. February 2013.
THE ART OF FAILURE: An Essay on the Pain of Playing Video Games by
Jesper Juul
Every day, hundreds of millions of people around the world play video
games—on smart phones, on computers, on consoles—and most of
them will experience failure at some point in the game; they will
lose, die, or fail to advance to the next level. Humans may have a
fundamental desire to succeed and feel competent, but game players
choose to engage in an activity in which they are nearly certain to fail
and feel incompetent. In The Art of Failure, Jesper Juul examines this
paradox. In video games, as in tragic works of art, literature, theater,
and cinema, it seems that we want to experience unpleasantness
even if we also dislike it. Reader or audience reaction to tragedy is
often explained as catharsis, as a purging of negative emotions. But, Juul points out, this
doesn’t seem to be the case for video game players. Games do not purge us of unpleasant
emotions; they produce them in the first place. What, then, does failure in video game
playing do? Juul argues that failure in a game is unique in that when you fail in a game,
you (not a character) are in some way inadequate. Yet games also motivate us to play
more, in order to escape that inadequacy, and the feeling of escaping failure (often by
improving skills) is a central enjoyment of games. Games, writes Juul, are the art of
failure: the singular art form that sets us up for failure and allows us to experience and
experiment with it. The Art of Failure is essential reading for anyone interested in video
games, whether as entertainment, art, or education. In the Playful Thinking series.
Hardcover. 168 pages; 54 b&w illustrations. February 2013.
AMBIENT COMMONS: Attention in the Age of Embodied Information by
Malcolm McCullough
The world is filling with ever more kinds of media, in ever more
contexts and formats. Glowing rectangles have become part of the
scene; screens, large and small, appear everywhere. Physical locations
are increasingly tagged and digitally augmented. Sensors, processors,
and memory are not found only in chic smart phones but also built
into everyday objects. Amid this flood, your attention practices matter
more than ever. So it is worth remembering that underneath all these
augmentations and data flows, fixed forms persist, and that to notice
them can improve other sensibilities. In Ambient Commons, Malcolm
McCullough explores the workings of attention through a rediscovery of
surroundings. Not all that informs has been written and sent; not all attention involves
deliberate thought. The intrinsic structure of space—the layout of a studio, for example,
or a plaza—becomes part of any mental engagement with it. McCullough describes what
he calls the Ambient: an increasing tendency to perceive information superabundance
whole, where individual signals matter less and at least some mediation assumes
inhabitable form. He explores how the fixed forms of architecture and the city play a
cognitive role in the flow of ambient information. As a persistently inhabited world,
can the Ambient be understood as a shared cultural resource, to be socially curated,
voluntarily limited, and self-governed as if a commons? Ambient Commons invites you
to look past current obsessions with smart phones to rethink attention itself, to care for
more situated, often inescapable forms of information. This monograph will be of interest
to a general educated audience as well as to students and practitioners in human-computer
interaction and new media.
Hardcover. 320 pages; 58 b&w illustrations. March 2013.
CONTAGIOUS ARCHITECTURE: Computation, Aesthetics, and Space by Luciana Parisi
In Contagious Architecture, Luciana Parisi offers a philosophical inquiry into the status
of the algorithm in architectural and interaction design. Her thesis is that algorithmic
computation is not simply an abstract mathematical tool but constitutes a mode of
thought in its own right, in that its operation extends into forms of abstraction that lie
beyond direct human cognition and control. These include modes of infinity, contingency,
and indeterminacy, as well as incomputable quantities underlying the iterative process
of algorithmic processing. The main philosophical source for the project is Alfred North
Whitehead, whose process philosophy is specifically designed to provide a vocabulary for
“modes of thought” exhibiting various degrees of autonomy from human agency even as
they are mobilized by it. Because algorithmic processing lies at the heart of the design
practices now reshaping our world—from the physical spaces of our built environment to
the networked spaces of digital culture—the nature of algorithmic thought is a topic of
pressing importance that reexamines questions of control and, ultimately, power. Scholars
and students of new media make up the core audience for Contagious Architecture.
Hardcover. 400 pages; 22 b&w illustrations. March 2013.
UNCERTAINTY IN GAMES by Greg Costikyan
In life, uncertainty surrounds us. Things that we thought were good
for us turn out to be bad for us (and vice versa); people we thought
we knew well behave in mysterious ways; the stock market takes a
nosedive. Thanks to an inexplicable optimism, most of the time we are
fairly cheerful about it all. But we do devote much effort to managing
and ameliorating uncertainty. Is it any wonder, then, asks Greg
Costikyan, that we have taken this aspect of our lives and transformed
it culturally, making a series of elaborate constructs that subject us to
uncertainty but in a fictive and nonthreatening way? That is: we create
games. In this concise and entertaining book, Costikyan, an award-winning game designer, argues that games require uncertainty to hold
our interest, and that the struggle to master uncertainty is central to their appeal. Game
designers, he suggests, can harness the idea of uncertainty to guide their work. Costikyan
explores the many sources of uncertainty in many sorts of games—from Super Mario
Bros. to Rock/Paper/Scissors, from Monopoly to CityVille, from FPS Deathmatch play to
Chess. He describes types of uncertainty, including performative uncertainty, analytic
complexity, and narrative anticipation. And he suggests ways that game designers who
want to craft novel game experiences can use an understanding of game uncertainty in its
many forms to improve their designs. Readers in game studies ranging from academics to
dedicated game players will welcome this addition to the new Playful Thinking series.
Hardcover. 136 pages. March 2013.
DIGITAL METHODS by Richard Rogers
How can we study social media to learn something about society
rather than about social media use? How can hyperlinks reveal not
just the value of a Web site, but the politics of association? In Digital
Methods, Richard Rogers proposes a methodological outlook for
social and cultural scholarly research on the Web that seeks to move
Internet research beyond the study of online culture. Rogers proposes
repurposing Web-native techniques for research into cultural change
and societal conditions. We can learn to reapply such “methods of
the medium” as crawling and crowdsourcing, PageRank and similar
algorithms, tag clouds and other visualizations; we can learn how they handle hits, likes,
tags, date stamps, and other Web-native objects. By “thinking along” with devices and
the objects they handle, digital research methods can follow the evolving methods of
the medium. Rogers uses this new methodological outlook to examine the findings of
inquiries into 9/11 search results, the recognition of climate change skeptics by climate
change-related Web sites, the events surrounding the Srebrenica massacre according to
Dutch, Serbian, Bosnian, and Croatian Wikipedias, presidential candidates’ social media
“friends,” and the censorship of the Iranian Web. With Digital Methods, Rogers introduces
a new vision and method for Internet research and at the same time applies them to the
Web’s objects of study, from tiny particles (hyperlinks) to large masses (social media).
Scholars and students of new media, Internet studies, information science, and science,
technology, and society will be the primary audience for Digital Methods.
Hardcover. 280 pages. April 2013.
MOVING INNOVATION: A History of Computer Animation by Tom Sito
Computer graphics (or CG) has changed the way we experience the
art of moving images. Computer graphics are the difference between
Steamboat Willie and Buzz Lightyear, between ping pong and PONG. It
began in 1963 when an MIT graduate student named Ivan Sutherland
created the first true computer animation program. Instead of
presenting a series of numbers, Sutherland’s Sketchpad program drew
lines that created recognizable images. Sutherland noted: “Since
motion can be put into Sketchpad drawings, it might be exciting to try
making cartoons.” This book, the first full-length history of CG, shows
us how Sutherland’s seemingly offhand idea grew into a multi-billion dollar industry. In
Moving Innovation, Tom Sito—himself an animator and industry insider for more than
thirty years—describes the evolution of computer graphics. The history of traditional
cinema technology is a fairly straight path from Lumière to MGM. Writing the history of
CG, Sito maps simultaneous accomplishments in multiple locales—academia, the military-industrial complex, movie special effects, video games, experimental film, corporate
research, and commercial animation. His story features a memorable cast of characters—
math nerds, avant-garde artists, cold warriors, hippies, video game enthusiasts, and
studio executives: disparate types united by a common vision. Computer animation
did not begin just with Pixar; Sito shows us how fifty years of work by this motley crew
made movies like Toy Story and Avatar possible. This accessible book will be of particular
interest to readers interested in new media generally and in the historical, technical,
artistic, and cultural phenomenon of computer graphics in particular.
Hardcover. 336 pages; 75 b&w illustrations. April 2013.
SPAM: A Shadow History of the Internet by Finn Brunton
The vast majority of all email sent every day is spam, a variety of
idiosyncratically spelled requests to provide account information,
invitations to spend money on dubious products, and pleas to send cash
overseas. Most of it is caught by filters before ever reaching an inbox. Where does it come from? As Finn Brunton explains in Spam, it is
produced and shaped by many different populations around the world:
programmers, con artists, bots and their botmasters, pharmaceutical
merchants, marketers, identity thieves, crooked bankers and their
victims, cops, lawyers, network security professionals, vigilantes,
and hackers. Every time we go online, we participate in the system
of spam, with choices, refusals, and purchases whose consequences
we may not understand. This is a book about what spam is, how it works, and what it
means. Brunton provides a cultural history that stretches from pranks on early computer
networks to the construction of a global criminal infrastructure. The history of spam,
Brunton shows us, is a shadow history of the Internet itself, with spam emerging as the
mirror image of the online communities it targets. Brunton traces spam through three
epochs: the 1970s to 1995, and the early, noncommercial computer networks that became
the Internet; 1995 to 2003, with the dot-com boom, the rise of spam’s entrepreneurs, and
the first efforts at regulating spam; and 2003 to the present, with the war of algorithms—
spam versus anti-spam. Spam shows us how technologies, from email to search
engines, are transformed by unintended consequences and adaptations, and how online
communities develop and invent governance for themselves. Scholars and students in
the social history of the Internet, readers in information science, new media, and media
studies are the primary audience for this addition to the Infrastructures series.
Hardcover. 304 pages. April 2013.
THE AESTHETICS OF IMAGINATION IN DESIGN by Mads Nygaard Folkmann
In The Aesthetics of Imagination in Design, Mads Folkmann investigates design in both
material and immaterial terms. Design objects, Folkmann argues, will always be dual
phenomena—material and immaterial, sensual and conceptual, actual and possible.
Drawing on formal theories of aesthetics and the phenomenology of imagination, he
seeks to answer fundamental questions about what design is and how it works that are
often ignored in academic research. Folkmann considers three conditions in design: the
possible, the aesthetic, and the imagination. Imagination is a central formative power
behind the creation and the life of design objects; aesthetics describes the sensual,
conceptual, and contextual codes through which design objects communicate; the
concept of the possible—the enabling of new uses, conceptions, and perceptions—lies
behind imagination and aesthetics. The possible, Folkmann argues, is contained as a
structure of meaning within the objects of design, which act as part of our interface with
the world. Folkmann makes use of discourses that range from practice-focused accounts
of design methodology to cultural studies. Throughout, he offers concrete examples
to illustrate theoretical points. Folkmann’s philosophically informed account shows
design—in all its manifestations, from physical products to principles of organization—to
be an essential medium for the articulation and transformation of culture. Scholars and
students of design and Human Computer Interaction will welcome this contribution to the
Design Thinking, Design Theory series.
Hardcover. 272 pages. April 2013.
WALKING AND MAPPING: Artists as Cartographers by Karen O’Rourke
Contemporary artists beginning with Guy Debord and Richard Long have returned again
and again to the walking motif. Debord and his friends tracked the urban flows of Paris;
Long trampled a path in the grass and snapped a picture of the result (A Line Made by
Walking). Mapping is a way for us to locate ourselves in the world physically, culturally, or
psychologically; Debord produced maps like collages that traced the “psychogeography”
of Paris. Today, the convergence of global networks, online databases, and new tools
for location-based mapping coincides with a resurgence of interest in walking as an art
form. In Walking and Mapping, Karen O’Rourke explores a series of walking/mapping
projects by contemporary artists. Some chart “emotional GPS”; some use GPS for creating
“datascapes” while others use their legs to do “speculative mapping.” Many work with
scientists, designers, and engineers. O’Rourke offers close readings of these works—many
of which she was able to experience firsthand—and situates them in relation to landmark
works from the past half-century. She shows that the infinitesimal details of each of these
projects take on more significance in conjunction with others. Together, they form a new
entity, a dynamic whole greater than the sum of its parts. By alternating close study of
selected projects with a broader view of their place in a bigger picture, Walking and
Mapping itself maps a complex phenomenon. In the Leonardo Book series, this title will
be of particular interest to readers on art and new media.
Hardcover. 360 pages; 115 b&w photos. April 2013.
AN AESTHESIA OF NETWORKS: Conjunctive Experience in Art and Technology by Anna Munster
Today almost every aspect of life for which data exists can be rendered as a network.
Financial data, social networks, biological ecologies: all are visualized in links and nodes,
lines connecting dots. A network visualization of a corporate infrastructure could look
remarkably similar to that of a terrorist organization. In An Aesthesia of Networks, Anna
Munster argues that this uniformity has flattened our experience of networks as active
and relational processes and assemblages. She counters the “network anaesthesia”
that results from this pervasive mimesis by reinserting the question of experience,
or aesthesia, into networked culture and aesthetics. Rather than asking how humans
experience computers and networks, Munster asks how networks experience—what
operations they perform and undergo to change and produce new forms of experience.
Drawing on William James’s radical empiricism, she asserts that networked experience
is assembled first and foremost through relations, which make up its most immediately
sensed and perceived aspect. Munster critically considers a range of contemporary
artistic and cultural practices that engage with network technologies and techniques,
including databases and data mining, the domination of search in online activity,
and the proliferation of viral media through YouTube. These practices—from artists
who “undermine” data to musicians and VJs who use intranetworked audio and video
software environments—are concerned with the relationality at the core of today’s
network experience. Scholars and students in new media are the primary audience for An
Aesthesia of Networks, in the Technologies of Lived Abstraction series.
Hardcover. 248 pages. May 2013.
CROWDSOURCING by Daren C. Brabham
Ever since the term “crowdsourcing” was coined in 2006 by Wired writer Jeff Howe,
group activities ranging from the creation of the Oxford English Dictionary to the
choosing of new colors for M&Ms have been labeled with this most buzz-generating of
media buzzwords. In this accessible but authoritative account, grounded in the empirical
literature, Daren Brabham explains what crowdsourcing is, what it is not, and how it
works. Crowdsourcing, Brabham tells us, is an online, distributed problem solving and
production model that leverages the collective intelligence of online communities for
specific purposes set forth by a crowdsourcing organization—corporate, government,
or volunteer. Uniquely, it combines a bottom-up, open, creative process with top-down
organizational goals. Crowdsourcing is not open source production, which lacks the top-down component; it is not a market research survey that offers participants a short list of
choices; and it is qualitatively different from predigital open innovation and collaborative
production processes, which lacked the speed, reach, rich capability, and lowered
barriers to entry enabled by the Internet. Brabham describes the intellectual roots of the
idea of crowdsourcing in such concepts as collective intelligence, the wisdom of crowds,
and distributed computing. He surveys the major issues in crowdsourcing, including crowd
motivation, the misconception of the amateur participant, crowdfunding, and the danger
of “crowdsploitation” of volunteer labor, citing real-world examples from Threadless,
InnoCentive, and other organizations. And he considers the future of crowdsourcing in
both theory and practice, describing its possible roles in journalism, governance, national
security, and science and health. General readers in computing, business, information
science, and Internet studies will be especially interested in Crowdsourcing, in the
Essential Knowledge series.
Paperback. 176 pages. May 2013.
STEALTH ASSESSMENT: Measuring and Supporting Learning in Games by Valerie Shute and
Matthew Ventura
To succeed in today’s interconnected and complex world, workers need to be able to
think systemically, creatively, and critically. Equipping K-16 students with these twenty-first-century competencies requires new thinking not only about what should be taught
in school but also about how to develop valid assessments to measure and support these
competencies. In Stealth Assessment, Valerie Shute and Matthew Ventura investigate an
approach that embeds performance-based assessments in digital games. They argue that
using well-designed games as vehicles to assess and support learning will help combat
students’ growing disengagement from school, provide dynamic and ongoing measures of
learning processes and outcomes, and offer students opportunities to apply such complex
competencies as creativity, problem-solving, persistence, and collaboration. Shute and
Ventura first discuss problems with such traditional assessment methods as multiple-choice questions; then review evidence relating to digital games and learning; and
illustrate the stealth assessment approach with a set of assessments they are developing
and embedding in the digital game Newton’s Playground. These stealth assessments are
intended to measure levels of creativity, persistence, and conceptual understanding of
Newtonian physics during game play. Finally, they consider future research directions
related to stealth assessment in education. In the John D. and Catherine T. MacArthur
Foundation Reports on Digital Media and Learning series, this title will be of particular
interest to education researchers.
Paperback. 80 pages. May 2013.
THE RINGTONE DIALECTIC: Economy and Cultural Form by Sumanth Gopinath
A decade ago, the customizable ringtone was ubiquitous. Almost any crowd of cellphone
owners could produce a carillon of tinkly, beeping, synthy, musicalized ringer signals.
Ringtones quickly became a multibillion-dollar global industry and almost as quickly
faded away. In The Ringtone Dialectic, Sumanth Gopinath charts the rise and fall of the
ringtone economy and assesses its effect on cultural production. Gopinath describes the
technical and economic structure of the ringtone industry, considering the transformation
of ringtones from monophonic, single-line synthesizer files to polyphonic MIDI files
to digital sound files and the concomitant change in the nature of capital and rent
accumulation within the industry. He discusses sociocultural practices that seemed
to wane as a result of these shifts, including ringtone labor, certain forms of musical
notation and representation, and the creation of contemporary musical and artistic works
quoting ringtones. In a series of studies, Gopinath examines “declines,” “reversals,”
and “revivals” of cultural forms associated with the ringtone and its changes, including
the Crazy Frog fad, the use of ringtones in political movements (as in the Philippine
“Gloriagate” scandal), the ringtone’s narrative function in film and television (including
its striking use in the films of the Chinese director Jia Zhangke), and the ringtone’s
relation to pop music (including possible race and class aspects of ringtone consumption).
Finally, Gopinath considers the attempt to rebrand ringtones as “mobile music” and the
emergence of cloud computing. Students and practitioners in the fields of new media
studies and information and communication studies constitute the main audience for The
Ringtone Dialectic.
Hardcover. 416 pages; 33 b&w illustrations. July 2013.
THE WELL-PLAYED GAME: A Player’s Philosophy by Bernard De Koven
In The Well-Played Game, games guru Bernard De Koven explores the interaction of
play and games, offering players—and game designers, educators, and scholars—a guide
to how games work. De Koven’s classic treatise on how human beings play together,
first published in 1978, investigates many issues newly resonant in the era of video and
computer games, including social gameplay, educational games, and player modification.
(Why not change the rules in pursuit of new ways to play?) The digital game industry,
now moving beyond its emphasis on graphic techniques to focus on player interaction,
has much to learn from The Well-Played Game. De Koven explains that when players
congratulate each other on a “well-played” game, they are expressing a unique and
profound synthesis that combines the concepts of play (with its associations of playfulness
and fun) and game (with its associations of rule-following). De Koven—affectionately
and appreciatively hailed by Eric Zimmerman as “our shaman of play”—explores the
experience of a well-played game, how we share it, and how we can experience it again;
issues of cheating, fairness, keeping score, changing old games, and making up new
games; and playing for keeps and winning. His book belongs on the shelf of players who
want to find a game in which they can play well, who are looking for others with whom
they can play well, and who have discovered the relationship between the well-played
game and the well-lived life. In addition to a general audience, readers in game studies
will welcome the reissue of this classic title.
Hardcover. 176 pages. August 2013.
BEYOND CHOICES: The Design of Ethical Gameplay by Miguel Sicart
Despite their current commercial success and their growing presence in the world’s
cultural landscape, computer games are facing a maturity crisis. On the one hand, digital
games propel a multimillion-dollar industry that pushes the boundaries of technology
and computing; on the other hand, the technical prowess of computer games is seldom
matched with deep, nuanced experiences that appeal to mature, more demanding
audiences. There have been cases of successful games that challenged conventions by
suggesting gameplay experiences based on ethical and political notions. Beyond Choices
is an in-depth reflection on the aesthetic and technical possibilities of digital games
as ethical experiences. Drawing on a wide variety of theoretical approaches, from
philosophy and game studies to design research and human-computer interaction, the
book provides theoretical and practical insights into the ways computer games are and can
be used to create engaging experiences that appeal to and challenge players’ values.
The book focuses on analyzing the design of games that have arguably succeeded in
creating mature ethical experiences. Scholars and students in new media concerned with
game design and game designers and developers make up the core audience for Beyond
Choices.
Hardcover. 200 pages. September 2013.
PHANTASMAL MEDIA: An Approach to Imagination, Computation, and Expression by D. Fox
Harrell
In Phantasmal Media, D. Fox Harrell considers the expressive power of computational
media. He argues that the great expressive potential of computational media comes
from the ability to construct and reveal phantasms—blends of cultural ideas and sensory
imagination. These ubiquitous and often-unseen phantasms—cognitive phenomena
that include sense of self, metaphors, social categories, narrative, and poetic
thinking—influence almost all our everyday experiences. Harrell offers an approach for
understanding and designing computational systems that have the power to evoke these
phantasms, paying special attention to the exposure of oppressive phantasms and the
creation of empowering ones. He argues for the importance of cultural content, diverse
worldviews, and social values in computing. The expressive power of phantasms is not
purely aesthetic, he contends; phantasmal media can express and construct the types
of meaning central to the human condition. Harrell discusses, among other topics, the
phantasm as an orienting perspective for developers; cultural phantasms that influence
consensus and reveal other perspectives; computing systems based on cultural models;
interaction and expression; and the ways that real-world information is mapped onto,
and instantiated by, computational data structures. The concept of phantasmal media,
Harrell argues, offers new possibilities for using the computer to understand and improve
the human condition through the human capacity to imagine. Scholars and students of
new media studies, art, and digital humanities make up the core audience for Phantasmal
Media.
Hardcover. 275 pages. September 2013.
SPECULATIVE EVERYTHING: Design, Fiction, and Social Dreaming by Anthony Dunne and Fiona
Raby
What happens when you decouple design from the marketplace, when rather than
making technology sexy, easy to use, and more consumable, designers use the language
of design to pose questions, inspire, and provoke—to transport our imaginations into
parallel but possible worlds? Speculative Everything refers to the idea that we need more
imagination and speculation not only in design but also in our everyday lives and in areas
such as politics, economics, and other disciplines. The book explores new ways design
can make technology more meaningful and relevant to our lives, both now and in the
future, by thinking not only about new applications, but implications as well. It unpacks
the interconnections between conceptual, critical, and speculative design in relation to
science and emerging technologies such as biotechnology. It moves from a general setting
out of what conceptual design is, through its use as a critical medium, to a facilitator
of debate around the implications of new developments in science, to a catalyst for
collaborative speculation with other disciplines. Dunne and Raby are among the most
well-known design thinkers in the world today and their ideas on the role and possibilities
of design will be of interest to designers, students, and scholars of design in particular.
Hardcover. 200 pages. September 2013.
THE AESTHETICS OF INTERACTION IN DIGITAL ART by Katja Kwastek
Since the 1960s, artworks that involve the participation of the spectator have received
extensive scholarly attention. Yet interactive artworks using digital media still present
a challenge for academic art history. In The Aesthetics of Interaction in Digital Art, Katja
Kwastek argues that the particular aesthetic experience enabled by these new media
works can open up new perspectives for our understanding of art and media alike.
Kwastek, herself an art historian, offers a set of theoretical and methodological tools
that are suitable for understanding and analyzing not only new media art but also other
contemporary art forms. Addressing both the theoretician and the practitioner, Kwastek
provides an introduction to the history and the terminology of interactive art, a theory
of the aesthetics of interaction, and exemplary case studies of interactive media art. She
discusses topics such as real space and data space, temporal structures, instrumental and
phenomenal perspectives, and the relationship between materiality and interpretability.
Finally, she applies her theory to specific works of interactive media art, including
narratives in virtual and real space, interactive installations, and performance—with case
studies of works by, among others, Olia Lialina, Susanne Berkenheger, Teri Rueb, Lynn
Hershman, Tmema, David Rokeby, and Blast Theory. Scholars in the humanities, students
of digital performance, and interactive art practitioners make up the core audience for
this book.
Hardcover. 380 pages; 80 b&w illustrations. September 2013. The MIT Press holds all
rights with the exception of German language rights.
THE CIVIC WEB: Young People, the Internet, and Civic Participation by Shakuntala Banaji and
David Buckingham
Over the past two decades, there has been widespread concern across Europe—and in
many other industrialized countries—about an apparent decline in civic and political
participation. Commentators have pointed to long-term reductions in voting rates,
declining levels of trust in politicians, and waning interest in civic affairs—phenomena
that are frequently seen as evidence of a fundamental crisis in democracy. These
characteristics are generally seen to be most apparent among the young. Some have
looked optimistically to new media—and particularly the Internet—as a means of
reengaging young people, thereby revitalizing civic life and democracy. The Civic Web
is based on an extensive pan-European research project that explored the role of the
Internet as a means of promoting civic engagement and participation among young people
aged 15-25. The authors examine the types of civic political web sites for young people
that are available; the reasons why such sites are being made, and the organizations
that make them; the interpretations, beliefs, and on- and off-line actions of the young
people who visit them; and why certain sites and civic organizations are more successful
at engaging young people than others. The countries considered include Hungary, the
Netherlands, Spain, Sweden, Slovenia, Turkey, and the UK. Researchers in digital media
and learning and students of these and related topics make up the core audience for this
book published in collaboration with The John D. and Catherine T. MacArthur Foundation
in their Digital Media and Learning series.
Hardcover. 240 pages; 30 b&w illustrations. September 2013.
COLLABORATIVE MEDIA: Production, Consumption, and Design Interventions by Jonas Löwgren
and Bo Reimer
For better or worse, the American Dialect Society voted “hashtag” as the word of the
year for 2012. This is an excellent example of the core argument of Collaborative Media:
the idea that media go beyond the traditional model of a producer distributing the
same media product to a large number of consumers. Instead, in collaborative media,
production is distributed so that people can, and do, engage in media content production.
Jonas Löwgren, an interaction designer, and Bo Reimer, a media studies professor, explore
this relatively new phenomenon via a series of case studies that illustrate the shift
from the traditional model of software design, in which design was to be complete
before release, to a model in which the product is launched in skeletal form, relying on
early adopters to engage in the continuous design of the platform—in the words of the
authors, a process of “perpetual beta.” Academics and researchers in human-computer
interaction and readers in media and communication studies make up the core audience
for Collaborative Media.
Hardcover. 248 pages; 45 b&w illustrations. October 2013.
COMPUTER SCIENCE, ROBOTICS, ARTIFICIAL INTELLIGENCE, MATHEMATICS
ALGORITHMS DEMYSTIFIED by Thomas H. Cormen
Have you ever wondered how your GPS can find the fastest way to your destination,
selecting one route from seemingly countless possibilities in mere seconds? How your
credit card account number is protected when you make a purchase over the Internet?
The answer is algorithms. And how do these mathematical procedures find their way
into your GPS, your laptop, or your smartphone? This book offers an
engagingly written guide to the basics of computer algorithms. In Algorithms Demystified,
Thomas Cormen—coauthor of the leading college textbook on the subject—provides a
general explanation, with limited mathematics, of how algorithms enable computers to
solve problems. Readers will learn what computer algorithms are, how to describe them,
and how to evaluate them. They will discover simple ways to search for information
in a computer; methods for rearranging information in a computer into a prescribed
order (“sorting”); how to solve basic problems that can be modeled in a computer
with a mathematical structure called a “graph” (useful for modeling road networks,
dependencies among tasks, and financial relationships); how to solve problems that ask
questions about strings of characters such as DNA structures; the basic principles behind
cryptography; fundamentals of data compression; and even that there are some problems
that no one has figured out how to solve on a computer in a reasonable amount of time.
Undergraduate and graduate students in non-technical fields, and a general audience
interested in algorithms make up the core audience for this monograph.
Hardcover. 240 pages. March 2013.
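The blurb's mention of simple ways to search for information in a computer can be made concrete with a short sketch (illustrative Python, not drawn from the book itself): binary search locates an item in a sorted list by halving the candidate range at each step.

```python
def binary_search(items, target):
    """Search a sorted list, halving the candidate range each step."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid        # found it at index mid
        elif items[mid] < target:
            lo = mid + 1      # target must be in the upper half
        else:
            hi = mid - 1      # target must be in the lower half
    return -1                 # not present

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # 3
```

On a list of a million items this needs at most about twenty comparisons, which is why such algorithms scale to GPS-sized problems.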
ROBOT FUTURES by Illah Reza Nourbakhsh
With robots, we are inventing a new species that is part material and part digital. The
ambition of modern robotics goes beyond copying humans, beyond the effort to make
walking, talking androids that are indistinguishable from people. Future robots will have
superhuman abilities in both the physical and digital realms. They will be embedded in
our physical spaces, with the ability to go where we can’t, and will have minds of their
own, thanks to artificial intelligence. They will be fully connected to the digital world,
far better at carrying out online tasks than we are. In Robot Futures, the roboticist Illah
Nourbakhsh considers how we will share our world—our physical and digital worlds—with
these creatures, and how our society could change as it incorporates a race of stronger,
smarter beings. Nourbakhsh imagines a future that includes adbots offering interactive
custom messaging; robotic flying toys that operate by means of “gaze tracking”; robot-enabled multimodal, multicontinental telepresence; and even a way that nanorobots
could allow us to assume different physical forms. Nourbakhsh follows each glimpse
into the robotic future with an examination of the underlying technology and an
exploration of the social consequences of the scenario. Each chapter describes a form of
technological empowerment—in some cases, empowerment run amok, with corporations
and institutions amassing even more power and influence and individuals unconstrained
by social accountability. (Imagine the hotheaded discourse of the Internet taking physical
form.) Nourbakhsh also offers a counter-vision: a robotics designed to create civic and
community empowerment. His book helps us understand why that is the robot future we
should try to bring about. Robot Futures is written for a broad general audience.
Hardcover. 160 pages. March 2013.
PROGRAMMING DISTRIBUTED COMPUTING SYSTEMS: A Foundational Approach by Carlos A. Varela
Starting from the premise that understanding the foundations of concurrent programming
is key to developing distributed computing systems, this book first presents the
fundamental theories of concurrent computing and then introduces the programming
languages that help develop distributed computing systems at a high level of abstraction.
The major theories of concurrent computation—including the π-calculus, the actor
model, the join calculus, and mobile ambients—are explained with a focus on how they
help design and reason about distributed and mobile computing systems. The book then
presents programming languages that follow the theoretical models already described,
including Pict, SALSA, and JoCaml. The parallel structure of the chapters in both part
one (theory) and part two (practice) enables the reader not only to compare the different
theories but also to see clearly how a programming language supports a theoretical
model. Programming Distributed Computing Systems is unique in bridging the gap
between the theory and the practice of programming distributed computing systems. It
can be used as a textbook for graduate and advanced undergraduate students in computer
science or as a reference for researchers in the area of programming technology for
distributed computing.
Hardcover. 314 pages; 91 b&w illustrations. June 2013.
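As a flavor of the actor model mentioned above (an illustrative Python caricature, not SALSA or JoCaml code): each actor owns private state and reacts to messages from its mailbox, so actors never touch one another's memory directly.

```python
import queue

class Actor:
    """A toy actor: private state plus a mailbox of messages,
    processed one at a time. State is never shared between actors."""
    def __init__(self):
        self.mailbox = queue.Queue()
        self.count = 0

    def send(self, msg):
        # Asynchronous send: just enqueue the message.
        self.mailbox.put(msg)

    def run(self):
        # Process every queued message in arrival order.
        while not self.mailbox.empty():
            self.receive(self.mailbox.get())

    def receive(self, msg):
        if msg == "inc":
            self.count += 1

a = Actor()
for _ in range(3):
    a.send("inc")
a.run()
print(a.count)  # 3
```

In a real actor language the mailbox is drained concurrently by a scheduler; serializing it here keeps the sketch deterministic.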
FUNCTIONAL DIFFERENTIAL GEOMETRY by Gerald Jay Sussman and Jack Wisdom
Physics is naturally expressed in mathematical language. Students new to the subject
must simultaneously learn an idiomatic mathematical language and the content that
is expressed in that language. It is as if they were asked to read Les Misérables while
struggling with French grammar. This book offers an innovative way to learn the
differential geometry needed as a foundation for a deep understanding of general
relativity or quantum field theory as taught at the college level. The approach taken
by the authors (and used in their classes at MIT for many years) differs from the
conventional one in several ways, including an emphasis on the development of the
covariant derivative and an avoidance of the use of traditional index notation for tensors
in favor of a semantically richer language of vector fields and differential forms. But the
biggest single difference is the authors’ integration of computer programming into their
explanations. By programming a computer to interpret a formula, the student soon learns
whether or not a formula is correct. Students are led to improve their program, and as
a result improve their understanding. Advanced students and researchers in the physical
sciences and mathematics make up the audience for this book.
Hardcover. 256 pages; 8 b&w illustrations. July 2013.
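The authors' example programs are written in Scheme; the core idea, that programming a formula lets the computer tell you whether it is right, can be hinted at with a hypothetical Python analogue that checks the identity d/dx sin(x) = cos(x) numerically.

```python
import math

def numeric_derivative(f, x, h=1e-6):
    """Central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Test the claimed formula d/dx sin(x) = cos(x) at a sample point.
x = 0.7
approx = numeric_derivative(math.sin, x)
print(abs(approx - math.cos(x)) < 1e-8)  # True
```

A wrong formula fails such a check immediately, which is exactly the feedback loop the book builds on.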
MATHEMATICAL MODELING IN SYSTEMS BIOLOGY: An Introduction by Brian P. Ingalls.
Systems techniques are integral to current research in molecular cell biology, and
system-level investigations are often accompanied by mathematical models. These
models serve as working hypotheses: they help us to understand and predict the
behavior of complex systems. Mathematical Modeling in Systems Biology offers an
introduction to mathematical concepts and techniques needed for the construction and
interpretation of models in molecular systems biology. The first four chapters cover the
basics of mathematical modeling in molecular systems biology. The last four chapters
address specific biological domains, treating modeling of metabolic networks, of signal
transduction pathways, of gene regulatory networks, and of electrophysiology and
neuronal action potentials. Chapters 3–8 end with optional sections that address more
specialized modeling topics. Exercises, solvable with pen-and-paper calculations, appear
throughout the text to encourage interaction with the mathematical techniques. More
involved end-of-chapter problem sets require computational software. Appendixes
provide a review of basic concepts of molecular biology, additional mathematical
background material, and tutorials for two computational software packages (XPPAUT and
MATLAB) that can be used for model simulation and analysis. The intended audience for
this title consists of upper-level undergraduate and graduate students in life science or
engineering who have some familiarity with calculus; the book will also be a useful
reference for researchers at all levels.
Hardcover. 356 pages. July 2013.
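The kind of working-hypothesis model the blurb describes can be illustrated with a minimal, hypothetical example (not taken from the book): a protein produced at a constant rate and degraded in proportion to its concentration, integrated with Euler's method.

```python
def simulate_decay(k_prod=2.0, k_deg=0.5, p0=0.0, dt=0.01, t_end=20.0):
    """Euler integration of dP/dt = k_prod - k_deg * P,
    a minimal model of protein production and degradation.
    All parameter names and values here are illustrative."""
    p, t = p0, 0.0
    while t < t_end:
        p += dt * (k_prod - k_deg * p)  # one Euler step
        t += dt
    return p

# The steady state is k_prod / k_deg = 4.0; the trajectory approaches it.
print(round(simulate_decay(), 2))  # 4.0
```

Pen-and-paper analysis gives the steady state directly; simulation shows how fast the system settles there, which is the interplay the book's exercises exploit.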
HUMAN ROBOTICS: Neuromechanics and Motor Control by Etienne Burdet, David W. Franklin,
and Theodore E. Milner
This book proposes a transdisciplinary approach to investigating human motor control that
synthesizes musculoskeletal biomechanics and neural control. The authors argue that this
integrated approach—which uses the framework of robotics to understand sensorimotor
control problems—offers a more complete and accurate description than either a purely
neural computational approach or a purely biomechanical one. The authors offer an
account of motor control in which explanatory models are based on experimental
evidence using mathematical approaches reminiscent of physics. These computational
models yield algorithms for motor control that may be used as tools to investigate or
treat diseases of the sensorimotor systems and to guide the development of algorithms
and hardware that can be incorporated into products designed to assist with the tasks of
daily living. The authors focus on the insights their approach offers in understanding how
movement of the arm is controlled and how the control adapts to changing environments.
The book begins with muscle mechanics and control, progresses in a logical manner to
planning and behavior, and describes applications in neurorehabilitation and robotics. The
material is self-contained, and accessible to researchers and professionals in a range of
fields, including psychology, kinesiology, neurology, computer science, and robotics.
Hardcover. 304 pages. August 2013.
INTRODUCTION TO COMPUTATION AND PROGRAMMING USING PYTHON by John V. Guttag
This book introduces students with little or no prior programming experience to the art
of computational problem solving using Python and various Python libraries, including
PyLab. It provides students with skills that will enable them to make productive use of
computational techniques, including some of the tools and techniques of “data science”
for using computation to model and interpret data. Introduction to Computation and
Programming Using Python is based on an MIT course (which became the most popular
course offered through MIT’s OpenCourseWare) and was developed for use not only in a
conventional classroom but also in a massive open online course (or MOOC) offered by the
pioneering MIT–Harvard collaboration edX. Students are introduced to Python and the
basics of programming in the context of such computational concepts and techniques
as exhaustive enumeration, bisection search, and efficient approximation algorithms.
The book does not require knowledge of mathematics beyond high school algebra, but
does assume that readers are comfortable with rigorous thinking and not intimidated
by mathematical concepts. Although it covers such traditional topics as computational
complexity and simple algorithms, the book focuses on a wide range of topics not
found in most introductory texts, including information visualization, simulations to
model randomness, computational techniques to understand data, and statistical
techniques that inform (and misinform) as well as two related but relatively advanced
topics: optimization problems and dynamic programming. This volume can serve as a
stepping-stone to more advanced computer science courses, or as a basic grounding in
computational problem solving for students in other disciplines.
Paperback. 296 pages; 117 b&w illustrations. September 2013.
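Bisection search, one of the techniques named above, can be sketched in the spirit of the course (an illustrative example, not the book's own code): approximate a square root by repeatedly halving an interval known to contain it.

```python
def sqrt_bisection(x, epsilon=1e-6):
    """Approximate sqrt(x) for x >= 0 by bisection:
    keep halving an interval that must contain the answer."""
    low, high = 0.0, max(1.0, x)
    guess = (low + high) / 2
    while abs(guess * guess - x) >= epsilon:
        if guess * guess < x:
            low = guess       # answer lies above the guess
        else:
            high = guess      # answer lies below the guess
        guess = (low + high) / 2
    return guess

print(round(sqrt_bisection(25.0), 3))  # 5.0
```

Compared with exhaustive enumeration of candidate answers, each iteration discards half of the remaining interval, so convergence takes only a few dozen steps.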
THE OUTER LIMITS OF REASON: What Science, Mathematics, and Logic Cannot Tell Us by Noson
S. Yanofsky
Many books explain what is known about the universe. This book investigates what
cannot be known. Rather than exploring the amazing facts that science, mathematics,
and reason have revealed to us, its focus is on what science, mathematics, and reason
tell us cannot be revealed. In The Outer Limits of Reason, Noson Yanofsky considers
what cannot be predicted, described, or known, and what will never be understood.
Moving from the concrete to the abstract, from problems of everyday language to
straightforward philosophical questions to the formalities of physics and mathematics,
Yanofsky demonstrates a myriad of unsolvable problems and paradoxes. He discusses
the limitations of computers, physics, logic, and our own thought processes. Yanofsky
describes simple tasks that would take computers trillions of centuries to complete and
other problems that computers can never solve; perfectly formed English sentences
that make no sense; different levels of infinity; the bizarre world of the quantum; the
relevance of relativity theory; the causes of chaos theory; math problems that cannot
be solved by normal means; and statements that are true but cannot be proven. Many of
these limitations have a similar pattern; by investigating these patterns, we can better
understand the structure and limitations of reason itself. Accessible to an educated
general audience, The Outer Limits of Reason will also be of interest to undergraduate
students in logic, science, mathematics, and philosophy.
Hardcover. 328 pages. September 2013.
FINITE STATE MACHINES IN HARDWARE: Theory and Design (with VHDL and Verilog) by Volnei A.
Pedroni
Modern, complex digital systems invariably include hardware-implemented finite
state machines. The correct design of such parts is crucial for attaining proper system
performance. This book offers detailed, comprehensive coverage of the theory and
design for any category of hardware-implemented finite state machines. It describes
crucial design problems that lead to incorrect or far from optimal implementation and
provides examples of finite state machines developed in both VHDL and SystemVerilog
(the successor of Verilog) hardware description languages. Important features include:
extensive review of design practices for sequential digital circuits; a new division of
all state machines into three hardware-based categories, encompassing all possible
situations, with numerous practical examples provided in all three categories; the
presentation of complete designs, with detailed VHDL and SystemVerilog codes,
comments, and simulation results, all tested in FPGA devices; and exercise examples,
all of which can be synthesized, simulated, and physically implemented in FPGA boards.
Additional material is available on the book’s Web site. This book offers the most detailed
coverage of finite state machines available. It will be essential for industrial designers of
digital systems and for students of electrical engineering and computer science.
Hardcover. 360 pages. November 2013.
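The book's designs are expressed in VHDL and SystemVerilog; the underlying abstraction, a state register plus a transition function, can be sketched language-neutrally in Python (the state names here are hypothetical, not taken from the book).

```python
# Transition table for a state machine that detects the bit
# sequence "101" on a serial input, allowing overlaps.
TRANSITIONS = {
    ("start", "0"): "start",
    ("start", "1"): "saw1",
    ("saw1",  "0"): "saw10",
    ("saw1",  "1"): "saw1",
    ("saw10", "0"): "start",
    ("saw10", "1"): "found",   # completes a "101"
    ("found", "0"): "saw10",   # trailing "10" may start the next match
    ("found", "1"): "saw1",
}

def detect_101(bits):
    """Run the machine over a bit string; count '101' occurrences."""
    state, hits = "start", 0
    for b in bits:
        state = TRANSITIONS[(state, b)]
        if state == "found":
            hits += 1
    return hits

print(detect_101("1101011"))  # 2
```

In hardware the dictionary becomes combinational next-state logic and the `state` variable becomes a clocked register; the table itself is what both descriptions share.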
COGNITIVE SCIENCE, NEUROSCIENCE, PHILOSOPHY OF MIND
THE NEURAL BASIS OF FREE WILL: Criterial Causation by Peter Ulric Tse
The issues of mental causation, consciousness, and free will have vexed philosophers
since Plato. In this book, Peter Tse examines these unresolved issues from a
neuroscientific perspective. In contrast with philosophers who use logic rather than
data to argue whether mental causation or consciousness can exist given unproven first
assumptions, Tse proposes that we instead listen to what neurons have to say. Because the
brain must already embody a solution to the mind–body problem, why not focus on how
the brain actually realizes mental causation? Tse draws on exciting recent neuroscientific
data concerning how informational causation is realized in physical causation at the level
of NMDA receptors, synapses, dendrites, neurons, and neuronal circuits. He argues that a
particular kind of strong free will and “downward” mental causation are realized in rapid
synaptic plasticity. Recent neurophysiological breakthroughs reveal that neurons function
as criterial assessors of their inputs, which then change the criteria that will make other
neurons fire in the future. Such informational causation cannot change the physical basis
of information realized in the present, but it can change the physical basis of information
that may be realized in the immediate future. This gets around the standard argument
against free will centered on the impossibility of self-causation. Researchers and graduate
students in neuroscience, especially visual and cognitive neuroscience, make up the core
audience for The Neural Basis of Free Will.
Hardcover. 320 pages; 28 b&w illustrations. February 2013.
EXPLAINING THE COMPUTATIONAL MIND by Marcin Milkowski
In this book, Marcin Milkowski argues that the mind can be explained computationally
because it is itself computational—whether it engages in mental arithmetic, parses
natural language, or processes the auditory signals that allow us to experience music.
Defending the computational explanation against objections to it—from John Searle and
Hilary Putnam in particular—Milkowski writes that computationalism is here to stay but
is not what many have taken it to be. It does not, for example, rely on a Cartesian gulf
between software and hardware, or mind and brain. Milkowski sketches a mechanistic
theory of implementation of computation against a background of extant conceptions,
describing four dissimilar computational models of cognition. Instead of arguing that
there is no computation without representation, he inverts the slogan and shows that
there is no representation without computation—but explains that representation
goes beyond purely computational considerations. Milkowski’s arguments succeed in
vindicating computational explanation in a novel way by relying on mechanistic theory
of science and interventionist theory of causation. Cognitive scientists and students, in
particular those focusing on computational neuroscience, are the core audience for this
title.
Hardcover. 248 pages. March 2013.
SPACE TO REASON: A Spatial Theory of Human Thought by Markus
Knauff
Many people, including many scholars, believe that human reasoning relies on visual
imagination. Markus Knauff shows that visual mental images are not relevant for
reasoning and can even impede the process of thought. Other scholars claim that human
thinking is solely based on abstract symbols and is completely embedded in language.
Knauff also argues against this view and shows that reasoning requires going beyond
language. In Space to Reason, Knauff proposes a third way to think about human
reasoning, one that places supramodal spatial representations at the heart of human
thought, even thought about non-spatial properties of the world. These spatial layout
models are more abstract than visual images and more concrete than language-like
symbolic representations. Many reasoning problems are ambiguous and thus
interpretable in several different ways. To deal with this problem, Knauff introduces
the notion of a preferred layout model: the one spatial layout model among many
others that has the best chance of being mentally constructed. Importantly, this
model preserves just spatial information, without incorporating the pictorial features
normally present in visual images. Readers in cognitive science and students and
researchers in applied technology such as artificial intelligence and spatial cognition
make up the core audience for this monograph.
Hardcover. 320 pages. March 2013.
THE HAND, AN ORGAN OF THE MIND: What the Manual Tells the Mental. Edited by Zdravko
Radman.
Cartesian-inspired dualism enforces a theoretical distinction between the motor and
the cognitive and locates the mental exclusively in the head. This collection of original
essays, focusing on the hand, challenges this dichotomy, offering theoretical and
empirical perspectives on the interconnectedness and interdependence of the manual and
mental. The contributors explore the possibility that the hand, far from being the merely
mechanical executor of preconceived mental plans, possesses its own know-how, enabling
“enhanded” beings to navigate the natural, social, and cultural world without engaging
propositional thought, consciousness, and deliberation. The contributors consider not
only broad philosophical questions—ranging from the nature of embodiment, enaction,
and the extended mind to the phenomenology of agency—but also such specific issues as
touching, grasping, gesturing, sociality, and simulation. They show that the capacities
of the hand include perception (on its own and in association with other modalities),
action, (extended) cognition, social interaction, and communication. Taken together,
their accounts offer a handbook of cutting-edge research that explores the ways that the
manual shapes and reshapes the mental and creates conditions for embodied agents to
act in the world. Readers in the philosophy of mind and in the cognitive sciences more
broadly make up the core audience for this collection.
Hardcover. 464 pages. April 2013.
COMMUNICATING MORAL CONCERN: An Ethics of Critical Responsiveness by Elise Springer
Modern moral theories have crystallized around the logic of individual choices, abstracted
from social and historical context. Yet moral theories can always be understood as a
responsive intervention in the social world out of which they emerge. In this novel
account of moral agency, Elise Springer argues that our participation in moral life is
bound up with our social responsiveness to the activity around us. To notice and address
what others are doing with their moral agency is to exercise what Springer calls critical
responsiveness. This approach to moral reflection frees moral theory from its association
with both righteous detachment—which places reactive attitudes and judgments at the
center of our moral responsiveness—and agenda-driven interventions—which sideline
the agency of those whose behavior we presume to correct. Springer’s account shows
how critical responsiveness might function as a practical engagement between agents,
reaching further than expressive representation but not as far as causal control. The
moral work she recommends is to draw our existing cacophony of responsive habits
into a more reflective critical practice, cultivating what she calls a “virtue of critical
engagement.” Scholars and students of philosophy focusing on ethics are the primary
audience for Communicating Moral Concern.
Hardcover. 328 pages. May 2013.
FEELING BEAUTY: The Sister Arts and the Neuroscience of Aesthetic Experience by Gabrielle G.
Starr.
Within the neurosciences, neuroaesthetics—the study of the neural bases of aesthetic
experience—is emerging as a fascinating subdiscipline. Feeling Beauty is the first attempt
to offer not a theory of a single aesthetic emotion or a single art form but a flexible,
broad, yet rigorous understanding of the neuroscience of aesthetics
across the arts. Focusing on music, painting, and poetry, the author elaborates a model
for understanding the dynamic and changing features of aesthetic life, the relationships
among the arts, and how individual differences in aesthetic judgment shape the varieties
of aesthetic experience. Gabrielle Starr draws on experimental work in neuroscience
(her own and that of other researchers), on the history of philosophy, and on the critical
traditions of art, poetry, and music. Because of the author’s own background as a literary
scholar and historian of aesthetics and her training in neuroscience, Feeling Beauty is the
first book on aesthetics to speak to both humanists and scientists, taking into account the
complexities of both the physical instantiation of aesthetics and of the realities of artistic
interpretation. An audience of scholars and students interested in cognitive science,
psychology, neuroscience, as well as aesthetics and art, will welcome this title.
Hardcover. 272 pages. July 2013.
HOW THINGS SHAPE THE MIND: A Theory of Material Engagement by Lambros Malafouris
An increasingly influential school of thought in cognitive science views the mind as
embodied, extended, and distributed, rather than brain-bound, “all in the head.” This
shift in perspective raises crucial questions about the relationship between cognition and
material culture, posing major challenges for philosophy, cognitive science, archaeology,
and anthropology. In How Things Shape the Mind, Lambros Malafouris proposes a cross-disciplinary analytical framework for investigating the different ways that things have
become cognitive extensions of the human body, and, using a variety of examples and
case studies, traces how those ways might have changed from earliest prehistory to
the present. Malafouris’s Material Engagement Theory adds materiality—the world of
things, artifacts, and material signs—into the cognitive equation. His account not only
questions conventional intuitions about the boundaries and location of the human mind
but also suggests that we rethink classical archaeological assumptions about human
cognitive evolution. Arguing that the understanding of human cognition is essentially
interlocked with the study of the technical mediations that constitute the central nodes
of a materially extended and distributed human mind, Malafouris offers a series of
archaeological and anthropological case studies—from Stone Age tools to the modern
potter’s wheel—to test his theory. How do things shape the mind? Considering the
implications of the seemingly uniquely human predisposition to reconfigure our bodies
and our senses by using tools and material culture, Malafouris adds a fresh perspective on
a foundational issue in the study of human cognition. In addition to readers in the cognitive
sciences and philosophy of mind, How Things Shape the Mind will be of interest to an
interdisciplinary audience that includes students of archaeology, anthropology, and
material culture.
Hardcover. 304 pages. July 2013.
MATTER AND CONSCIOUSNESS, third edition, by Paul M. Churchland
In Matter and Consciousness, Paul Churchland presents a concise and contemporary
overview of the philosophical issues surrounding the mind and explains the main theories
and philosophical positions that have been proposed to solve them. Making the case
for the relevance of theoretical and experimental results in neuroscience, cognitive
science, and artificial intelligence for the philosophy of mind, Churchland reviews current
developments in the cognitive sciences and offers a clear and accessible account of the
connections to philosophy of mind. For this third edition, the text has been updated and
revised throughout. The changes range from references to the iPhone’s “Siri” to expanded
discussions of the work of such contemporary philosophers as David Chalmers, John
Searle, and Thomas Nagel. Churchland describes new research in evolution, genetics, and
visual neuroscience, among other areas, arguing that the philosophical significance of
these new findings lies in the support they tend to give to the reductive and eliminative
versions of materialism. Matter and Consciousness, written by the most distinguished
theorist and commentator in the field, offers an authoritative summary and sourcebook
for issues in philosophy of mind. It is suitable for use as an introductory undergraduate
text.
Hardcover. 240 pages. July 2013.
MINDVAULTS: Sociocultural Grounds for Pretending and Imagining by Radu J. Bogdan
The human mind has the capacity to vault over the realm of current perception,
motivation, emotion, and action, to leap—consciously and deliberately—to past or future,
possible or impossible, abstract or concrete scenarios and situations. In this book, Radu
Bogdan examines the roots of this uniquely human ability, which he terms “mindvaulting.”
He focuses particularly on the capacities of pretending and imagining, which he identifies
as the first forms of mindvaulting to develop in childhood. Pretending and imagining,
Bogdan argues, are crucial steps on the ontogenetic staircase to the intellect. Bogdan
finds that pretending and then imagining develop from a variety of sources for reasons
that are specific and unique to human childhood. He argues that these capacities arise as
responses to sociocultural and sociopolitical pressures that emerge at different stages of
childhood. Bogdan argues that some of the properties of mindvaulting—including domain
versatility and nonmodularity—resist standard evolutionary explanations. To resolve this
puzzle, Bogdan reorients the evolutionary analysis toward human ontogeny, construed as
a genuine space of evolution with specific pressures and adaptive responses. Bogdan finds
that pretending is an ontogenetic response to sociocultural challenges in early childhood,
a pre-adaptation for imagining; after age four, the adaptive response to cooperative and
competitive sociopolitical pressures is a competence for mental strategizing that morphs
into imagining. Scholars and students in the philosophy of mind make up the core of the
audience for this book.
Hardcover. 256 pages. July 2013.
CULTURAL EVOLUTION. Edited by Peter J. Richerson and Morten H. Christiansen.
Culture in its many manifestations—social organization, technology, science, language,
religion—is responsible for the striking difference between humans and other organisms as
well as for our ecological dominance of the Earth. Over the past few decades, a growing
body of research has emerged from a variety of disciplines highlighting the importance
of cultural evolution in our understanding of human behavior. Wider application of these
insights, however, has been hampered by traditional disciplinary boundaries. To remedy
this, key players from theoretical biology, developmental and cognitive psychology,
linguistics, anthropology, sociology, religious studies, history, and economics convened
to explore the central role of cultural evolution in human social structure, technology,
language, and religion. The resulting volume, consisting of original contributions,
synthesizes past and ongoing work on cultural evolution and sketches a roadmap for
future cross-disciplinary efforts. Cultural evolution can provide an important integrating
function across the various disciplines of the human sciences, similar to that of organic
evolution in biology. There are many aspects of human endeavor where our understanding
can be improved by adopting a cultural evolutionary perspective as demonstrated by the
sections on social systems, technology, language, and religion. Scholars and students in
the cognitive sciences, and in evolutionary theory and cultural science in particular, will be
interested in this volume.
Hardcover. 450 pages. August 2013.
STORYTELLING AND THE SCIENCE OF MIND by David Herman
With Storytelling and the Science of Mind, David Herman proposes a cross-fertilization
between the study of narrative and research on intelligent behavior. This cross-fertilization goes beyond the simple importing of ideas from the sciences of mind into
scholarship on narrative and instead aims for convergence between work in narrative
studies and research in the cognitive sciences. The book as a whole centers on two
questions: How do people make sense of stories? And: How do people use stories to make
sense of the world? Using case studies that range from Robert Louis Stevenson’s Dr Jekyll
and Mr Hyde to sequences from The Incredible Hulk comics to narratives told in everyday
interaction, Herman considers storytelling both as a target for interpretation and as a
resource for making sense of experience itself. In doing so, he puts ideas from narrative
scholarship into dialogue with such fields as psycholinguistics, philosophy of mind, and
cognitive, social, and ecological psychology. After exploring ways in which interpreters of
stories can use textual cues to build narrative worlds, or storyworlds, Herman investigates
how this process of narrative worldmaking in turn supports efforts to understand—and
engage with—the conduct of persons, among other aspects of lived experience. Readers
in cognitive science and linguistics, as well as students in sociology, literary studies,
and related disciplines, make up the core audience for this book.
Hardcover. 400 pages. August 2013.
FEELING EXTENDED: Sociality as Extended Body-Becoming-Mind by Douglas Robinson
The extended-mind thesis (EMT), usually attributed to Andy Clark and David Chalmers,
proposes that in specific kinds of mind-body-world interaction there emerges an extended
cognitive system incorporating such extracranial supports as pencils, papers, computers,
and other objects and environments in the world. In Feeling Extended, Douglas Robinson
accepts the thesis, but argues that the usual debate over EMT—which centers on whether
mind really (literally, actually, materially) extends to body and world or only seems to—
oversimplifies the issue. When we say that mind feels as if it extends, Robinson argues,
what extends is precisely feeling—and mind, insofar as it arises out of feeling. Robinson
explores the world of affect and conation as intermediate realms of being between the
physical movements of body and the qualitative movements of mind. He shows that
affect is transcranial and tends to become interpersonal conation. Affective-becoming-conative sociality, he argues, is in fact the primary area in which body-becoming-mind
extends. He draws on a wide spectrum of philosophical thought—from the EMT and qualia
debates among cognitivists to the prehistory of such debates in the work of Hegel and
Peirce to continental challenges to Hegelianism from Bakhtin and Derrida—as well as on
extensive empirical research in social psychology and important sociological theories of
face (Goffman), ritual (Connerton), and habitus (Bourdieu). Scholars and students in the
philosophy of mind and cognitive scientists more broadly, as well as readers in continental
philosophy, will welcome Feeling Extended.
Hardcover. 256 pages. September 2013.
NEUROSCIENCE OF CREATIVITY. Edited by Adam S. Bristol, Oshin Vartanian, and James C.
Kaufman.
This volume offers a comprehensive overview of the latest neuroscientific approaches to
the scientific study of creativity. In chapters that progress logically from neurobiological
fundamentals to systems neuroscience and neuroimaging, leading scholars describe the
latest theoretical, genetic, structural, clinical, functional, and applied research on the
neural bases of creativity. The treatment is both broad and in depth, offering a range of
neuroscientific perspectives with detailed coverage by experts in each area. Following
opening chapters that offer theoretical context, the contributors discuss such issues as
the heritability of creativity; creativity in patients with brain damage, neurodegenerative
conditions, and mental illness; clinical interventions and the relationship between
psychopathology and creativity; neuroimaging studies of intelligence and creativity;
neuroscientific basis of creativity-enhancing methodologies; and the information-processing challenges of viewing visual art. Neuroscientists and students of neuroscience,
psychology, and cognitive science more broadly will welcome this timely overview.
Hardcover. 272 pages. September 2013.
SCHIZOPHRENIA: Evolution and Synthesis. Edited by Steven M. Silverstein, Bita Moghaddam,
and Til Wykes.
Despite major advances in methodology and thousands of published studies every
year, treatment outcomes in schizophrenia have not improved over the last fifty years.
Moreover, we still lack strategies for prevention, and we do not yet understand how the
interaction of genetic, developmental, and environmental factors contributes to the
disorder. In this book, leading researchers consider conceptual and technical obstacles
to progress in understanding schizophrenia and suggest novel strategies for advancing
research and treatment. The contributors address a wide range of critical issues: the
construct of schizophrenia itself; etiology, risk, prediction, and prevention; different
methods of modeling the disorder; and treatment development and delivery. They
identify crucial gaps in our knowledge and offer creative but feasible suggestions.
These strategies include viewing schizophrenia as a heterogeneous group of conditions;
adopting specific new approaches to prediction and early intervention; developing better
integration of data across genetics, imaging, perception, cognition, phenomenology, and
other fields; and moving toward an evidence-based, personalized approach to treatment
requiring rational clinical decision making to reduce functional disability. Neuroscientists,
cognitive scientists, and students and practitioners of psychiatry constitute the core
audience for this volume, in the Strüngmann Forum Reports series.
Hardcover. 400 pages. September 2013.
SCRIPTING READING MOTIONS: The Codex and the Computer as Self-Reflexive Machines by
Manuel Portela
In Scripting Reading Motions, Manuel Portela explores the expressive use of book forms
and programmable media in experimental works of both print and electronic literature
and finds a self-conscious play with the dynamics of reading and writing. Portela examines
a series of print and digital works by Johanna Drucker, Mark Z. Danielewski, Rui Torres,
Jim Andrews, and others, for the insights they yield about the semiotic and interpretive
actions through which readers produce meaning when interacting with codes. Analyzing
these works as embodiments and simulations of the motions of reading, Portela pays
particular attention to the ways in which awareness of eye movements and haptic
interactions in both media feeds back onto the material and semantic layers of the works.
These feedbacks, he argues, sustain self-reflexive loops that link the body of the reader
to the embodied work. Among the topics explored by the author are typographic and
graphic marks as choreographic notations for reading movements; digital recreations of
experimental print literary artifacts; reading motions in kinetic and generated texts; and
the relationship of bibliographic, linguistic, and narrative coding in Danielewski’s novel-poem, Only Revolutions. The expressive use of print and programmable media, Portela
shows, offers a powerful model of the semiotic, interpretive, and affective operations
embodied in reading processes. Scholars and students of new media, digital poetry,
book arts, and experimental forms of literature make up the core audience for Scripting
Reading Motions.
Hardcover. 320 pages; 96 b&w illustrations. September 2013.
THE COGNITIVE-EMOTIONAL BRAIN: From Interactions to Integration by Luiz Pessoa
The idea that a specific brain circuit constitutes the emotional brain (and its corollary,
that cognition resides elsewhere) shaped thinking about emotion and the brain for many
years. Recent behavioral, neuropsychological, neuroanatomical, and neuroimaging
research, however, suggests that emotion interacts with cognition in the brain. The
amygdala is often viewed as the quintessential emotional region of the brain, but
Pessoa reviews findings revealing that many of its functions contribute to attention
and decision making, critical components of cognitive functions. He counters the idea
of a subcortical pathway to the amygdala for affective visual stimuli with an alternate
framework, the multiple waves model. Citing research on reward and motivation, Pessoa
also proposes the dual competition model, which explains emotional and motivational
processing in terms of their influence on competition processes at both perceptual and
executive function levels. He considers the broader issue of structure-function mappings,
and examines anatomical features of several regions often associated with emotional
processing, highlighting their connectivity properties. As new theoretical frameworks
of distributed processing evolve, Pessoa concludes, a truly dynamic network view of
the brain will emerge, in which “emotion” and “cognition” may be used as labels in the
context of certain behaviors, but will not map cleanly into compartmentalized pieces of
the brain. Cognitive and computational neuroscientists and students make up the core
audience for this book.
Hardcover. 304 pages. September 2013.
RELIABILITY IN COGNITIVE NEUROSCIENCE: A Meta-Meta Analysis by William R. Uttal
Cognitive neuroscientists increasingly claim that brain images generated by new brain
imaging technologies reflect, correlate, or represent cognitive processes. In this book,
William Uttal warns against these claims, arguing that, despite its utility in anatomic and
physiological applications, brain imaging research has not provided consistent evidence
for correlation with cognition. Uttal bases his argument on an extensive review of the
empirical literature, pointing to variability in data not only among subjects within
individual experiments but also in the new meta-analytical approach that pools data from
different experiments. This inconsistency of results, he argues, has profound implications
for the field, suggesting that cognitive neuroscientists have not yet proven their
interpretations of the relation between brain activity captured by macroscopic imaging
techniques and cognitive processes; what may have appeared to be correlations may
have only been illusions of association. He supports the view that the true correlates are
located at a much more microscopic level of analysis: the networks of neurons that make
up the brain. He argues that although the idea seems straightforward, the task of pooling
data from different experiments is extremely complex, leading to uncertain results, and
that little is gained by it. Researchers and students in cognitive neuroscience particularly
and in related fields in cognition will welcome this timely book.
Hardcover. 254 pages. October 2013.
VISUAL PSYCHOPHYSICS: From Laboratory to Theory by Zhong-Lin Lu and Barbara Dosher
Vision is one of the most active areas in biomedical research, and visual psychophysical
techniques are a foundational methodology for this research enterprise. Visual
psychophysics, which studies the relationship between the physical world and human
behavior, is a classical field of study that has widespread applications in modern vision
science. Bridging the gap between theory and practice, this textbook provides a
comprehensive treatment of visual psychophysics, teaching not only basic techniques but
also sophisticated data analysis methodologies and theoretical approaches. It begins with
practical information about setting up a vision lab and goes on to discuss the creation,
manipulation, and display of visual images; timing and integration of displays with
measurements of brain activities and other relevant techniques; experimental designs;
estimation of behavioral functions; and examples of psychophysics in applied and clinical
settings. The book discusses the theoretical underpinnings of data analysis and scientific
interpretation, presenting data analysis techniques that include model fitting, model
comparison, and a general framework for optimized adaptive testing methods. It includes
many sample programs in Matlab with functions from Psychtoolbox, a free toolbox for
real-time experimental control. Graduate students and researchers in vision are the
primary audience for this textbook.
Hardcover. 400 pages. October 2013.
GENETIC INFLUENCE ON ADDICTION: An Intermediate Phenotype Approach. Edited by James
MacKillop and Marcus R. Munafo
Although the general scientific consensus holds that genetic factors play a substantial
role in an individual’s vulnerability to drug or alcohol addiction, specific genetic variables
linked to risk or resilience remain elusive. Understanding how genetic factors contribute
to addiction may require focusing on intermediary mechanisms, or intermediate
phenotypes, that connect genetic variation and risk for addiction. The intermediate
phenotype approach, which extends the established endophenotype approach, considers
all genetically informative phenotypes. This book offers a comprehensive review of this
mechanistic-centered approach and the most promising intermediate phenotypes. The
contributors first consider the most established findings in the field, including variability
in drug metabolism, brain electrophysiological profiles, and subjective reactions to
direct drug effects; they go on to review such highly promising areas as expectancies,
attentional processing, and behavioral economic variables; and finally, they investigate
more exploratory approaches, including the differential susceptibility hypothesis,
epigenetic modifications as potential intermediate phenotypes, and efforts to close the
gap between mouse and human genetics. Taken together, the chapters offer a macro-level testing of the hypothesis that these alternative, mechanistic phenotypes can
advance addiction research. The book will be of interest to researchers and practitioners
in a range of disciplines, including behavioral genetics, psychology, pharmacology,
neuroscience, and sociology.
Hardcover. 352 pages. November 2013.
THE NEW VISUAL NEUROSCIENCES. Edited by John S. Werner and Leon M. Chalupa.
Visual science is the model system for neuroscience, its findings relevant to all other
areas. This essential reference to contemporary visual neuroscience covers the
extraordinary range of the field today, from molecules and cell assemblies to systems
and therapies. It provides a state-of-the-art companion to the earlier book The Visual
Neurosciences (MIT Press, 2003). This volume covers the dramatic advances made in
the last decade, offering new topics, new authors, and new chapters. The New Visual
Neurosciences assembles groundbreaking research, written by international authorities.
Many of the 112 chapters treat seminal topics not included in the earlier book. These new
topics include retinal feature detection; cortical connectomics; new approaches to mid-level vision and spatiotemporal perception; the latest understanding of how multimodal
integration contributes to visual perception; new theoretical work on the role of neural
oscillations in information processing; and new molecular and genetic techniques for
understanding visual system development. An entirely new section covers invertebrate
vision, reflecting the importance of this research in understanding fundamental principles
of visual processing. Another new section treats translational visual neuroscience,
covering recent progress in novel treatment modalities for optic nerve disorders,
macular degeneration, and retinal cell replacement. The New Visual Neurosciences is
an indispensable reference for students, teachers, researchers, clinicians, and anyone
interested in contemporary neuroscience.
Hardcover. 2000 pages; 575 b&w illustrations. November 2013.
ENVIRONMENTAL SCIENCE, ENVIRONMENTAL POLICY, BIOETHICS, URBAN PLANNING
ECO-BUSINESS: A Big-Brand Takeover of Sustainability by Peter
Dauvergne and Jane Lister
McDonald’s promises to use only beef, coffee, fish, chicken, and
cooking oil obtained from sustainable sources. Coca-Cola promises to
achieve water neutrality. Unilever has set a deadline of 2020 to reach
100 percent sustainable agricultural sourcing. Walmart has pledged
to become carbon neutral. Today, big-brand companies seem to be
making commitments that go beyond the usual “greenwashing” efforts
undertaken largely for public relations purposes. In Eco-Business, Peter
Dauvergne and Jane Lister examine this new corporate embrace of
sustainability, its actual accomplishments, and the consequences for
the environment. For many leading-brand companies, these corporate
sustainability efforts go deep, reorienting central operations and extending through global
supply chains. Advocacy groups and governments are partnering with these companies,
eager to reap the governance potential of eco-business efforts. Yet, as Dauvergne
and Lister point out, these companies are using sustainability as a business tool. The
acclaimed eco-efficiencies achieved by big-brand companies limit the potential for finding
deeper solutions to pressing environmental problems and reinforce runaway consumption.
Eco-business promotes the sustainability of big business, not the sustainability of life
on earth. In addition to general readers, Eco-Business will be of interest to students of
political science, business, and sociology.
Hardcover. 208 pages; 4 b&w illustrations. February 2013.
THE ENVIRONMENTAL ADVANTAGES OF CITIES: Countering Commonsense Antiurbanism by
William B. Meyer
Conventional wisdom about the environmental impact of cities holds that urbanization
and environmental quality are necessarily at odds. Cities are seen to be sites of ecological
disruption, consuming a disproportionate share of natural resources, producing high
levels of pollution, and concentrating harmful emissions precisely where the population is
most concentrated. Cities appear to be particularly vulnerable to natural disasters, to be
inherently at risk from outbreaks of infectious diseases, and even to offer dysfunctional
and unnatural settings for human life. In this book, William Meyer tests these widely
held beliefs against the evidence. Borrowing some useful terminology from the public
health literature, Meyer weighs instances of “urban penalty” against those of “urban
advantage.” He finds that many supposed urban environmental penalties are illusory,
based on commonsense preconceptions and not on solid evidence. In fact, greater
degrees of “urbanness” often offer advantages rather than penalties. The characteristic
compactness of cities, for example, lessens the pressure on ecological systems and
enables resource consumption to be more efficient. On the whole, Meyer reports,
cities offer greater safety from environmental hazards (geophysical, technological, and
biological) than more dispersed settlement does. In fact, the city-defining characteristics
widely supposed to result in environmental penalties do much to account for cities’
environmental advantages. As of 2008 (according to U.N. statistics), more people live in
cities than in rural areas. Meyer’s analysis clarifies the effects of such a profound shift,
covering a full range of environmental issues in urban settings. Scholars and students in
environmental studies, urban studies, geography, planning, and sociology are the primary
audience for this book.
Hardcover. 248 pages. March 2013.
CLIMATE ENGINEERING by David Keith
Currently, David Keith has dual appointments as Gordon McKay Professor of Applied
Physics in the School of Engineering and Applied Sciences at Harvard University and
Professor of Public Policy at the Harvard Kennedy School. He has been working at the
intersection of climate science, energy technology, and public policy for the past two
decades. In Climate Engineering, he proposes a controversial “solution” for climate
change that focuses on cooling the earth by injecting particles into the atmosphere,
rather than decreasing the carbon emissions that caused the problem. Keith offers
no naïve proposal for an easy fix to perhaps the most challenging question of our time.
Instead he argues that we must put the idea on the table because we have not yet
succeeded in reducing emissions significantly. This book is an effort to look at solar
engineering responsibly—to explore its potential effectiveness and costs, its possible
unintended consequences, and the very morality of considering it as a technical fix to
climate change that may well undermine commitments to conserving energy and reducing
emissions. General readers interested in environmental studies, environmental science, climatology,
current affairs, and public policy will welcome this original title in the Boston Review
series.
Hardcover. 112 pages. August 2013.
THE FUTURE IS NOT WHAT IT USED TO BE: Climate Change and Energy Scarcity by Jörg
Friedrichs
The future is not what it used to be because we can no longer rely on the comforting
assumption that it will resemble the past. Past abundance of fuel, for example, does not
imply unending abundance. Infinite growth on a finite planet is not possible. In this book,
Jörg Friedrichs argues that industrial society itself is transitory, and he examines the
prospects for our civilization’s coming to terms with its two most imminent choke points:
climate change and energy scarcity. He offers a thorough and accessible account of these
two challenges as well as the linkages between them. Friedrichs contends that industrial
civilization cannot outlast our ability to burn fossil fuels and that the demise of industrial
society would entail cataclysmic change, including population decreases. To understand
the social and political implications, he examines historical cases of climate stress and
energy scarcity: devastating droughts in the ancient Near East; the Little Ice Age in the
medieval Far North; the Japanese struggle to prevent “fuel starvation” from 1918 to
1945; the “totalitarian retrenchment” of the North Korean governing class after the end
of Soviet oil deliveries; and Cuba’s socioeconomic adaptation to fuel scarcity in the 1990s.
Friedrichs suggests that to confront our predicament we must affirm our core values
and take action to transform our way of life. Whether we are private citizens or public
officials, complacency is not an option: climate change and energy scarcity are emerging
facts of life. General readers and students of environmental science, political science,
and economics will be especially interested in The Future Is Not What It Used to Be.
Hardcover. 224 pages; 4 b&w illustrations. August 2013.
SUSTAINABLE URBAN METABOLISM by Paulo C. Ferrão and John E. Fernández
Urbanization and globalization have shaped the last hundred years. These two dominant
trends are mutually reinforcing: globalization links countries through the networked
communications of urban hubs. The urban population now generates more than eighty
percent of global GDP. Cities account for enormous flows of energy and materials—inflows
of goods and services and outflows of waste. Thus urban environmental management
critically affects global sustainability. In this book, Paulo Ferrão and John Fernández
offer a metabolic perspective on urban sustainability, viewing the city as a metabolism,
in terms of its exchanges of matter and energy. Sustainable Urban Metabolism provides
a roadmap to the strategies and tools needed for a scientifically based framework for
analyzing and promoting the sustainability of urban systems. Using the concept of urban
metabolism as a unifying framework, Ferrão and Fernández describe a systems-oriented
approach that establishes useful linkages among environmental, economic, social,
and technical infrastructure issues. These linkages lead to an integrated information-intensive platform that enables ecologically informed urban planning. After establishing
the theoretical background and describing the diversity of contributing disciplines, the
authors sample sustainability approaches and tools, offer an extended study of the
urban metabolism of Lisbon, and outline the challenges and opportunities in approaching
urban sustainability in both developed and developing countries. Scholars, students, and
practitioners in urban and regional planning and environmental engineering make up the
core audience for this book.
Hardcover. 232 pages. August 2013.
THE NEW SCIENCE OF CITIES by Michael Batty
In The New Science of Cities, Michael Batty suggests that to understand cities we
must view them not simply as places in space but as systems of networks and flows. To
understand space, he argues, we must understand flows, and to understand flows, we
must understand networks—the relations between objects that make up the system
of the city. Drawing on the complexity sciences, social physics, urban economics,
transportation theory, regional science, and urban geography, and building on his own
previous work, Batty introduces theories and methods that reveal the deep structure of
how cities function. Batty presents the foundations of a new science of cities, defining
flows and their networks and introducing tools that can be applied to understanding
different aspects of city structure. He examines the size of cities, their internal order,
the transport routes that define them, and the locations that fix these networks. He
introduces methods of simulation that range from simple stochastic models to bottom-up evolutionary models to aggregate land-use transportation models. Then, using largely
the same tools, he presents design and decision-making models that predict interactions
and flows in future cities. These networks emphasize a notion with relevance for
future research and planning: that design of cities is collective action. Researchers and
professionals in urban studies and planning, geography, and urban economics constitute
the core audience for The New Science of Cities.
Hardcover. 400 pages; 135 b&w illustrations. October 2013.
ECONOMICS, FINANCE, INTERNATIONAL RELATIONS, POLITICAL SCIENCE, BUSINESS
AMERICA’S ASSEMBLY LINE by David E. Nye
The assembly line was invented in 1913 and has been in continuous
operation ever since. It is the most familiar form of mass production.
Both praised as a boon to workers and condemned for exploiting them,
it has been celebrated and satirized. (We can still picture Chaplin’s
little tramp trying to keep up with a factory conveyor belt.) In
America’s Assembly Line, David Nye examines the industrial innovation
that made the United States productive and wealthy in the twentieth
century. The assembly line—developed at the Ford Motor Company in
1913 for the mass production of Model Ts—first created and then served
an expanding mass market. It inspired fiction, paintings, photographs,
comedy, cafeteria layouts, and cookie-cutter suburban housing. It also
transformed industrial labor and provoked strikes and union drives. During World War II
and the Cold War, it was often seen as a bastion of liberty and capitalism. By 1980, Japan
had reinvented the assembly line as a system of “lean manufacturing”; American industry
reluctantly adopted this new approach. Nye describes this evolution and the new global
landscape of increasingly automated factories, with fewer industrial jobs in America
and questionable working conditions in developing countries. A century after Ford’s
pioneering innovation, the assembly line continues to evolve toward more sustainable
manufacturing. In addition to general readers, students of business history, technology,
innovation, and related fields will welcome America’s Assembly Line.
Hardcover. 360 pages; 50 b&w illustrations. February 2013.
CHRONICLES FROM THE FIELD: The Townsend Thai Project by Robert M.
Townsend, Sombat Sakunthasathian, and Rob Jordan
Running since 1997 and continuing today, the Townsend Thai Project
has tracked millions of observations about the economic activities of
households and institutions in rural and urban Thailand. The project
represents one of the most extensive datasets in the developing
world. Chronicles from the Field offers an account of the design and
implementation of this unique panel data survey. It tells the story
not only of the origins and operations of the project but also of the
challenges and rewards that come from a search to understand the
process of a country’s economic development. The book explains
the technical details of data collection and survey instruments but
emphasizes the human side of the project, describing the culture shock felt by city-dwelling survey enumerators in rural villages, the “surprising, eye-opening, and inspiring”
responses to survey questions, and the never-ending resourcefulness of the survey team.
The text is supplemented by an epilogue on research findings and policy recommendations
and an appendix that contains a list and abstracts of published and working papers,
organized by topic, using data from the project. Social and economic policies are too
often skewed by political considerations. The Townsend Thai Project offers another basis
for policy: accurate measurement based on thoroughly collected data. From this, a clear
template emerges for understanding poverty and alleviating it. Economists and students
of economics, particularly those with an interest in development economics and Southeast
Asia, and social scientists more broadly will be interested in this unique book.
Hardcover. 160 pages. April 2013.
INTERMEDIATE PUBLIC ECONOMICS, second edition, by Jean Hindriks
and Gareth D. Myles
Public economics studies how government taxing and spending
activities affect the economy—economic efficiency and the distribution
of income and wealth. This comprehensive text on public economics
covers the core topics of market failure and taxation as well as
recent developments in both policy and the academic literature.
It is unique not only in its broad scope but in its balance between
public finance and public choice and its combination of theory and
relevant empirical evidence. Intermediate Public Economics covers the theory and
methodology of public economics; presents a historical and theoretical overview of the
public sector; and discusses such topics as departures from efficiency (including imperfect
competition and asymmetric information), issues in political economy, equity, taxation,
fiscal federalism, and tax competition among independent jurisdictions. Suggestions for
further reading, from classic papers to recent research, appear in each chapter, as do
exercises. The mathematics has been kept to a minimum without sacrificing intellectual
rigor; the book remains analytical rather than discursive. This second edition has been
thoroughly updated throughout. It offers new chapters on behavioral economics, limits to
redistribution, international taxation, cost–benefit analysis, and the economics of climate
policy. Additional exercises have been added and many sections revised in response to
advice from readers of the first edition. Advanced undergraduate and graduate students
of economics and public finance are the primary audience for this revised textbook.
Hardcover. 952 pages; 214 b&w illustrations. April 2013.
BANKING ON DEMOCRACY: Financial Markets and Elections in Emerging Countries by Javier
Santiso
Politics matter for financial markets and financial markets matter for politics, and
nowhere is this relationship more apparent than in emerging markets. In Banking on
Democracy, Javier Santiso investigates the links between politics and finance in countries
that have recently experienced both economic and democratic transitions. He focuses
on elections, investigating whether there is a “democratic premium”—whether financial
markets and investors tend to react positively to elections in emerging markets. Santiso
devotes special attention to Latin America, where over the last three decades many
countries became democracies, with regular elections, just as they also became open
economies dependent on foreign investment. Santiso’s analysis draws on a unique set of
primary databases (developed during his years at the OECD Development Centre) covering
an entire decade and comprising more than 5,000 bank recommendations on emerging
markets as well as fund manager portfolio recommendations. Santiso examines the
trajectory of Brazil through
its presidential elections of 2002, 2006, and 2010 and finds a decoupling of financial
and political cycles that occurred also in many other emerging economies. He charts
this evolution through the behavior of brokers, fund managers, bankers, and sovereign
wealth funds. Academics and students in political economy and students of finance and
investment make up the core audience for this book.
Hardcover. 336 pages; 84 b&w illustrations. June 2013.
MADE IN THE USA: The Rise and Retreat of American Manufacturing by Vaclav Smil
In Made in the USA, Vaclav Smil powerfully rebuts the notion that manufacturing is a
relic of pre-digital history and that the loss of American manufacturing is a desirable
evolutionary step toward a pure service economy. Smil argues that no advanced economy
can prosper without a strong, innovative manufacturing sector and the jobs it creates.
Reversing a famous information economy dictum, Smil maintains that serving potato chips
is not as good as making microchips. The history of manufacturing in America, Smil tells
us, is a story of nation-building. He explains how manufacturing became a fundamental
force behind America’s economic, strategic, and social dominance. He describes American
manufacturing’s rapid rise at the end of the nineteenth century, its consolidation and
modernization between the two world wars, its role as an enabler of mass consumption
after 1945, and its recent decline. Some economists argue that shipping low-value jobs
overseas matters little because the high-value work remains in the United States. But,
asks Smil, do we want a society that consists of a small population of workers doing
high-value-added work and masses of unemployed? Smil assesses various suggestions for
solving America’s manufacturing crisis, including lowering corporate tax rates, promoting
research and development, and improving public education. Will America act to preserve
and reinvigorate its manufacturing? It is crucial to our social and economic well-being;
but, Smil warns, the odds are no better than even. Written for a general audience
interested in current affairs, this book will also be appealing to students of Science,
Technology, and Society and to readers in business, history, and economics.
Hardcover. 256 pages. August 2013.
OPEN ECONOMY MACROECONOMICS IN DEVELOPING COUNTRIES by Carlos A. Végh
This rigorous and comprehensive textbook develops a basic small open economy model
and shows how it can be extended to answer many important macroeconomic questions
that arise in emerging markets and developing economies, particularly those regarding
monetary, fiscal, and exchange rate issues. Eschewing the complex calibrated models on
which the field of international finance increasingly relies, the book teaches the reader
how to think in terms of simple models and grasp the fundamentals of open economy
macroeconomics. After analyzing the standard intertemporal small open economy
model, the book introduces frictions such as imperfect capital markets, intertemporal
distortions, and non-tradable goods, into the basic model in order to shed light on the
economy’s response to different shocks. The book then introduces money into the model
to analyze the real effects of monetary and exchange rate policy. It then applies these
theoretical tools to a variety of important macroeconomic issues relevant to developing
countries (and, in a world of continuing financial crisis, to industrial countries as well),
including the use of a nominal interest rate as a main policy instrument, the relative
merits of flexible and predetermined exchange rate regimes, and the targeting of “real
anchors.” Finally, the book analyzes in detail specific topics such as inflation stabilization,
“dollarization,” balance of payments crises, and, inspired by recent events, financial
crises. Each chapter includes boxes with relevant empirical evidence and ends with
exercises. The book is suitable for use in graduate courses in development economics,
international finance, and macroeconomics.
Hardcover. 528 pages; 172 figures; 43 tables. August 2013.
AN INTRODUCTION TO ECONOMETRICS: A Self-Contained Approach by Frank Westhoff
This self-contained introduction to econometrics provides undergraduate students with
a command of regression analysis in one semester, enabling them to grasp the empirical
literature and undertake serious quantitative projects of their own. It does not assume
any previous exposure to probability and statistics but covers the concepts in probability
and statistics that are essential for econometrics at the outset. The bulk of the textbook
is devoted to regression analysis, from simple to advanced topics. Students will gain an
intuitive understanding of the mathematical concepts; Java simulations on the book’s
Web site confirm the algebraic equations derived in the text and demonstrate the
important concepts. After presenting the essentials of probability and statistics, the
book covers simple regression analysis, multiple regression analysis, and advanced topics
including heteroskedasticity, autocorrelation, measurement error, large sample properties,
instrumental variables, simultaneous equations, panel data, and binary/truncated
dependent variables. Two optional chapters treat additional probability and statistics
topics. Each chapter offers examples, preview problems (bringing students “up to speed”
at the beginning of a chapter), review questions, and exercises. An accompanying Web
site offers students easy access to Java simulations and data sets. After a single semester
spent mastering the material presented in this book, students will be prepared to take
any of the many elective courses that use econometric techniques.
Hardcover. 432 pages. September 2013.
LESSONS FROM THE ECONOMICS OF CRIME: What Works to Reduce Offenses? Edited by Philip J.
Cook, Stephen Jonathan Machin, Olivier Marie, and Giovanni Mastrobuoni
Economists who bring the tools of economic analysis to bear on the study of crime
contribute a normative framework and sophisticated quantitative methods for evaluating
policy; the idea of criminal behavior as rational choice; and the connection of individual
choices to aggregate outcomes. The contributors to this volume, all writing original
contributions, draw on all three of these approaches in their investigations of crime and
crime prevention. Reporting on research in the United States, Europe, and South America,
the chapters discuss such topics as a cost-benefit analysis of additional police hiring; the
testing of innovative policy interventions through field experiments; imprisonment and
recidivism rates; incentives and disincentives for sports hooliganism (“hooliganomics”);
data showing the influence of organized crime on the quality of local politicians; and
the (scant) empirical evidence for the effect of immigration on crime. These chapters
demonstrate the increasingly eclectic approach of economists studying crime as well
as economists’ increasing respect for the contributions of other social scientists in this
area. Scholars and students of the economics of crime and readers in political science,
sociology, and law make up the core audience for this contribution to the CESifo Seminar
Series.
Hardcover. 240 pages. September 2013.
LONELY IDEAS: Can Russia Compete? by Loren Graham
When have you gone into an electronics store, picked up a desirable gadget, and found
that it was labeled “Made in Russia”? Probably never. Loren Graham, a leading scholar
on science and technology, shows that for three centuries Russia has been adept at
developing technical ideas but abysmal at benefiting from them. From the seventeenth-century arms industry through twentieth-century Nobel-awarded work in lasers, Russia
has failed to sustain its technological inventiveness. Graham identifies a range of
conditions that nurture technological innovation: a society that values inventiveness and
practicality; an economic system that provides investment opportunities; a legal system
that protects intellectual property; a political system that encourages innovation and
success. Graham finds Russia lacking on all counts. He explains that Russia’s failure to
sustain technology, accompanied by recurrent attempts to force modernization, is key to
understanding its political and social evolution and its resistance to democratic principles
in particular. But Graham points to new connections between Western companies and
Russian researchers, new research institutions, a national focus on nanotechnology,
and the establishment of Skolkovo, “a new technology city.” Today, he argues, Russia
has the best chance in its history to break its pattern of technological failure. General
readers interested in current events and in the history of Russia and the Soviet Union,
as well as an international business audience, will be especially interested in Lonely Ideas.
Hardcover. 240 pages. September 2013.
SYSTEMS, NETWORKS, AND INTERDEPENDENCE IN GLOBAL POLITICAL ECONOMY by Hilton L. Root
Liberal internationalism has been the West’s foreign policy agenda since the Cold War, and
the liberal West has occupied the top rung of liberal internationalism’s hierarchical
ladder. In this book, Hilton Root argues that the system of international relations has
become a complex ecosystem, no longer hierarchical. The transition from hierarchies
to networked systems is changing every facet of global interaction, and requires a
new language for understanding the process of change. Root proposes the evolutionary
theory of complexity as an analytical framework to explain the unforeseen development
failures, governance trends, and alliance shifts in today’s global political economy. Root
employs systems analysis, in which institutional change and economic development are
understood as self-organizing complexities, to offer an alternative view of institutional
change and persistence. From this perspective, he considers the divergence of East
and West; the emergence of the European state, its contrast with the rise of China,
and the network properties of their respective innovation systems; the trajectory of
democracy in developing regions; and the systemic impact of China on the liberal world
order. Complexity science, Root argues, will not explain historical change processes with
algorithmic precision, but it may offer explanations that match the messy richness of
those processes. In addition to an educated audience interested in macroeconomics,
readers for this title include policymakers and students of international relations,
economic development, and business.
Hardcover. 320 pages. September 2013.
WORKER LEADERSHIP: America’s Secret Weapon in the Battle for Industrial Competitiveness by
Fred Stahl
How can American manufacturing recapture its former dominance in the globalized
industrial economy? In Worker Leadership, Fred Stahl proposes a strategy to boost
enterprise productivity and restore America’s industrial power. Stahl outlines a
revolutionary transformation of industrial culture that offers workers real empowerment
and authority (as well as a monetary share of the savings from productivity gains). Stahl’s
concept of worker productivity reverses the standard formulation—that the happier
people are, the more productive they will be—to assert instead that the more productive
people are, the happier they are with their jobs. Stahl’s Worker Leadership strategy
develops the theory into a concrete approach, with real-world examples. Combining
some of the methods of lean manufacturing made famous by Toyota with genuine worker
empowerment unlike anything at Toyota, Worker Leadership creates highly productive
jobs loaded with responsibility and authority. Workers, Stahl writes, love these jobs
precisely because of the opportunities to be creative and productive. Stahl’s approach
was inspired by changes implemented at John Deere Company’s Harvester Works by
a general manager named Dick Kleine. Stahl also discusses competition with China and
South Korea and tells the story of a factory that GE recently “reshored” from China
to the United States, considers the potential for applying Worker Leadership beyond
manufacturing, provides a brief history of manufacturing, and even reveals the dark side
of Toyota’s system that opens another opportunity for America. Worker Leadership offers
a blueprint for global competitive advantage that should be read by anyone concerned
about America’s current productivity paralysis.
Hardcover. 256 pages. September 2013.
ECONOMIC THEORY AND MACROECONOMIC PRACTICE: A Non-Technical Primer by Kartik B.
Athreya
There is a great deal of skepticism about the work of macroeconomists. Some believe
that they live in a fantasyland of beautiful mathematical models disconnected from
the real world or that macroeconomists worship at the altar of the free market. Kartik
Athreya wants to set the record straight by providing a non-technical treatment of several
concepts central to macroeconomics. Athreya creates a conceptual foundation through an
examination of the Arrow-Debreu-McKenzie (ADM) model, which suggests that a market
system can be comprehensively analyzed in terms of the neoclassical methodological
premises of individual rationality, market clearing, and rational expectations, using the
two mathematical techniques of convexity and fixed point theory. Athreya continues
with an examination of the relationship among prices, efficiency, and equality and
the “Welfare Theorems” that relate them to the ADM model. He then examines the
process and tradeoffs that have led to the consensus view
of macroeconomic model-building and output. Advanced undergraduate and graduate
students and researchers in macroeconomics, in addition to political scientists and public
policymakers, make up the core audience for this non-technical work.
Hardcover. 464 pages. December 2013.