Periodizing the American Century: Modernism, Postmodernism, and Postcolonialism in the Cold
War Context
Ann Douglas
As a cultural historian who has spent her adult life writing about three different decades of American
culture, I have inevitably thought a good deal about periodization. Fredric Jameson has reminded us that,
if repressed, the hunger for such temporal demarcation will always return; it seems integral to the modern
temper. 1 The periods my first two books covered were relatively easy to name. The Feminization of
American Culture was set in the Victorian era (the last historical period in which the United States let
another western nation dominate the naming game); my second, Terrible Honesty: Mongrel Manhattan in
the 1920s, in the modern era. 2 Now that I am working full-time in the last half of the twentieth century,
things have become far more problematic.
I find myself beset by a host of prefixes, all clamoring for my attention: the collaborating and competing
"post's"--as in "postmodern," "postcolonial," "post-Marxist," and "post-Freudian"--as well as a lesser crowd
of "neo's," some grouping edgily with the "post's," as in "neo-Marxist," "neo-Freudian," and "neocolonial,"
and others, like "neoconservative" and "neoliberal," striking out on their own. I mention only in passing the
fashionable prefixes "meta"--as in "metanarrative" and "metahistory"--and "hyper"--as in "hyperreality" and
"hypertext"--because these terms make less strenuous claims for themselves as temporal signifiers. All
these prefixes seem to me self-evasions, however. What precedes us is named, but we are not--prime
instances of the widespread language disorder that characterizes the contemporary
scene. When it comes to naming ourselves, we seem like planes circling and circling our target, unable to
land. Jameson, Stuart Hall, Andreas Huyssen, and other critics have suggested that we can't not use
terms like postmodern and postcolonial, and they must be right, since I find myself using these terms
here, and elsewhere. 3 We nonetheless might wish to situate our prefix addiction historically.
Managing time or history by naming it has been largely a western and white obsession, a function, in
other words, of power elites stamping their image on the world at large. Even among the white elites of
the West, time-naming as a mainstream and highly interpretative activity is of fairly recent origin. The
American 1920s, the time in which the full-fledged media-consumer culture was born, invented the
decade as a marketing device and a fashion statement--a time unit, in other words, not as the product of
historians' hindsight, but as an advertisement for the present. Earlier eras in the West, as outside it, had
tended to name themselves and their predecessors by quasipolitical or philosophical labels, often
honoring a monarch or a movement along the way: the various dynasties in Chinese history, the
Restoration in Great Britain, the Enlightenment era in Europe, or the Victorian age in Great Britain and the
United States. However, the commercially and cosmetically minded American 1920s called itself the Jazz
Age, the lost decade, and the lost generation, and spoke of the Gay [18]90s and the Feminine [18]50s.
History was commodified into colorful sound bites, a habit the American media continued in following
decades with the Beat Generation, baby boomers, and the Woodstock Generation. 4
None of these terms, of course, had much to do with people of color living in the United States at the
designated times. The 1890s were not "gay" for Southern blacks who saw Reconstruction reforms
replaced by Jim Crow laws and Ku Klux Klan lynchings, or for Asian immigrants struggling to preserve a
precarious economic and legal foothold on the West Coast. The members of the Harlem Renaissance (a
name bestowed on it by the white press in the mid-1920s) hardly thought of themselves as "lost," and it is
questionable if the term "baby boomers" had much resonance for the hundreds of thousands of Puerto
Ricans immigrating to New York City in the decades just after World War II. 5
Today's media have saddled us with the term "Generation X" for the young, and, recognizing a kindred
enterprise, they have adopted wholesale the term "postmodern." In doing so, they legitimate the dilemma
of the academy, admitting the difficulty in our late capitalist, postcolonial, postmodern age of coming up
with generations or decades, or any names except those, as deconstructionists like to say, under erasure,
terms of deferred rather than active definition. Few of us, apparently, are willing to dig into the Pandora's
box of history, throwing out the "post's" and "neo's" in hopes of finding another, better name at the
bottom, one that might no longer feature, even by the tactic of conspicuous absence, ourselves, the
traditional name-givers. "Postmodern," in particular, for all its apparent modesty, its air of "Hey, I'm just
the guy in the rear, out of touch with a quickly-vanishing chain of command," suggests
what I call the post-moi syndrome: it bespeaks a group intent on keeping nomenclature's power intact,
even if historical authority has moved elsewhere--into the mysterious, undemarcated lands of the not-me.
Before the twentieth century was a decade old, W. E. B. DuBois, for one, seemed quite clear on what to
call it; this was the era of the "color line," the "race" era. In his later, explicitly Marxist years, he settled on
various forms of the terms "colonialism" and "capitalism" to describe the advancing century. I find
DuBois's terms more satisfactory than any of today's "posts"; they have the advantage of directly naming
an antagonist or a problem. DuBois's remark about the color line ("the problem of the twentieth century is
the problem of the color line") is much cited these days, but it has never been treated as a contender by
the official judges of the naming sweepstakes. 6
Aside from their highly privileged status, what do the name-givers, today's "post"-speak group, have in
common? As is often noted, they are usually literary and cultural critics, with a sprinkling of
anthropologists, rather than historians, economists, or political scientists; yet it is the latter groups who
have traditionally been most concerned with issues of periodization. This latest version of the conflicted
dialogue between history and literature, itself a kind of perpetual coitus interruptus, is a civil war;
historians and literary scholars critique themselves as well as each other. Historians, joined by literary
critics like Masao Miyoshi, Edward Said, and Terry Eagleton, have recently taken aim, in various ways, at those
working in the fields of New Historicism, postcolonialism, postmodernism, multiculturalism, and cultural
studies, accusing them of poaching on the historian's terrain. These historians and literary critics see the
scholars in the newer fields as raiders of the ark, making off with a few, odd, glittering historical facts and
textual artifacts, but not stopping to do the hard work necessary to understand the context from which
they are snatched, distorting or misreading history for their own purposes. 7
When historians speak to me about literary critics working in their own period, people who are often their
friends and whose brilliance they wholeheartedly praise, almost invariably they make remarks like, "She
does no real research," "He's a charlatan," "He doesn't keep up with scholarship." Perhaps because it is
more embarrassing to be caught without the facts than to be accused of having an inadequate
interpretation of them, or perhaps because a number of literary critics really are as uninterested and
inexpert in "hard" history as some historians think, critics have been less aggressive in faulting the
historians. Nonetheless, there is a strong sense among literary people, again shared by a number of
historians like Hayden White and Dominick LaCapra, that historians are conservative laggards ill-equipped for in-depth cultural analysis who subscribe to old-fashioned mimetic "metanarratives" replete
with cause-and-effect "effects" geared to the interests of dominant groups. 8
It would be folly, I think, to deny the merits of some of the charges on both sides. A quick look at almost
any major American university suggests that the politics of literature departments are usually further to
the Left and their curriculum more radically inclusive than those of their peer history departments.
Relatively few historians have mastered the close reading skills developed by literary critics, skills I
believe fundamental to historical investigation. 9 On the other side, historians like Natalie
Davis, Joyce Appleby, Margaret Jacobs, and Lynn Hunt speak with authority when they remind us that as
long as life is chronological, in other words, as long as we all die (Hemingway once remarked that all
stories, if they are any good, end in death), history will in some way honor narrative and even
metanarrative. While no narrative can any longer in good faith count itself altogether objective or expect
to corner the market on truth, all inquiry raises questions of fact, and responsible, extensive research can
at least cut down on inaccurate or imprecise reportage. I must confess that postmodern and cultural
studies critics occasionally seem to be less finding their way in the dark than refusing to turn on the light--for light there is, and if it is faint, it is all the more precious.
In my own efforts to periodize the later twentieth century, I have found that literary critics and historians
are collaborating more than they are quarreling, though they do not always see it that way. There is a
massive difficulty inherent in trying to periodize the era of late capitalism, a difficulty predicated on its
global reach, and fully acknowledged by even the fiercest critics of postmodern and postcolonial studies
like the geographer David Harvey and the historian Arif Dirlik. It is to our credit that we recognize this
difficulty, as most of our predecessors, busy handing out sweeping titles like "modern" and "Victorian" as
trophies reserved exclusively for themselves, did not. Modern, or for that matter, postmodern or
postcolonial, we ask, for whom, when, and where? We know that there are more things in heaven and
earth than are dreamt of in your philosophy. Whoever you may be, if you have a philosophy, you have a
politics, an ideology, one with self-interest written all over it, whatever else it may contain.
The two most important "post's," the postmodern and postcolonial, both of which make sense only in
reference to late western capitalism and its expansionist and hegemonic tendencies, are terms that
advertise, while not always acknowledging, their own points of view. Postmodern usually refers to white
Euro-American sites and modes of modernity and modernization; postcolonial means the less developed
countries of the Third World (itself a term I will continue to use as long as poverty, debt, and malaria
demarcate its boundaries so precisely) and its populations of color whom the European powers colonized
and exploited. 10 Not surprisingly, the early theorists of postmodernism, among them, Jean-François
Lyotard, Jean Baudrillard, Jameson, and Huyssen, were largely white, First World, and male, while the
early theorists of postcolonialism like Edward Said, Gayatri Spivak, Stuart Hall, and Homi Bhabha tended
to be persons of color and/or to have Third-World allegiances. Few in either camp are happy with the
term under which they are grouped, although their work has helped to define and popularize it.
The third important term used to describe the second half of the twentieth century, the "cold war," is
familiar to both historians and literary critics. Only historians, however, make it central to their periodizing
attempts, and it is precisely there that cultural critics have the most to learn. I do not mean that we should
officially adopt yet another name for the second half of the twentieth century, but rather that exploring
this term might help us to understand better the period we are naming, and more
important, living in. The term "cold war," too, has a point of view, and its authorship is precisely known.
Largely a U.S. term, it was used to designate, if not initiate, the struggle the foreign policy elite saw
between the so-called free world of capitalism headed by the U.S. and the closed-market regimes of
international communism represented by the Soviet Union and its satellites, as each superpower
struggled to best the other in the nuclear arms race and carve out impregnable spheres of influence and
power for itself. 11 The supposed end of the cold war in the late 1980s, a titular triumph for the forces of
western democracy, and the subsequent opening of some Soviet archives have unleashed a wave of
self-congratulatory history from liberal and conservative scholars alike. They depict the United States as a
stalwart champion of liberty saving the world from a demonic Soviet regime, a regime now seen, not
inaccurately, as having been primarily occupied in piling up domestic atrocities on a scale that
overshadows even the Holocaust. 12 Few historians, however, deny that the cold war served American
ends better than Soviet ones; the Soviet Union's collapse and the United States's new title as the world's
last superpower admit no other interpretation. Postmodernity, an avowedly U.S. phenomenon, and even
postcoloniality, whose sources are largely elsewhere, are both roughly coterminous with the cold war, and
even inexplicable outside the context it supplies.
The indispensable material conditions of postmodernity are U.S. global supremacy and the means by
which it was accomplished--atom bombs, television, computers, and the ever more powerful and intrusive
multi- and transnational corporations. All of these phenomena owe their existence, or their increased
economic strength and powers of self-justification, to World War II; the last year of the war is also the
moment in which a number of historians see the cold war--a war not with the Nazis but with the Soviet
Union, the U.S.'s wartime ally--taking shape. 13
Espionage, with its Pynchonesque, hyperfictive scenario of agents, double agents, secret strategic
weapons, forged documents, and operational feints, reached new heights during World War II. 14 The
United States, long protected by its relative geographic isolation, had been a laggard in the intelligence
field, but in the war years it caught up with its European and Soviet rivals, buying up the best spies in the
business, the Nazi operatives collecting intelligence about the Soviet Union. These men's recruitment was
integral to the launch of the wartime Office of Strategic Services (OSS) and its cold war successor, the
Central Intelligence Agency (CIA), established in 1947. 15 Postmodernism involves a sometimes willful
flirtation with the unknowable, even a calculated extension of it, and the CIA, in conjunction with the State
and Defense Departments, matched the powers of strategic obfuscation since displayed by its ally, the
transnational corporation. The doctrine of plausible deniability, promulgated by President Truman's
National Security Council in 1948 and central to postmodernity, was a way of suppressing evidence and
destroying knowledge. In 1960, as the CIA began planning the assassination of Fidel Castro, President
Eisenhower reminded his advisers, "Our hand should not show"; "Everyone must be prepared to say he
has not heard of it" (GS, 496). To this day, even after the opening of a number of
government files, historians quarrel about whether Eisenhower actually ordered or tacitly condoned the
assassination of Patrice Lumumba, the leader of Congolese independence, and whether John F.
Kennedy was fully briefed as he took office in 1961 on the assassination plots against Lumumba and
Castro. 16
The compartmentalization of knowledge and the classification of top-secret materials initiated by the
Manhattan Project and rapidly extended to other strategic government activities meant a palpable,
unprecedented shift in the nature and image of information itself, a shift in the ratio between information
that was available to everyone, like the Gross National Product or public political rhetoric; information that
was known by everyone to be available only to a few, like nuclear information; and secrets, like the CIA
assassination plots known only to a handful of men. The widening division between censored and
uncensored knowledge released a flood of hyperinterpretation, of which New Criticism's ascendancy in
the academy was but one manifestation. William Friedman, the Chief Cryptographer of the War
Department, who broke the Japanese military code "Purple" during World War II, found his first job in the
employ of an eccentric millionaire, searching for a cipher that would prove that Francis Bacon
really wrote Shakespeare's plays. NSC 68, the top-secret document adopted by President Truman's
National Security Council in 1950 and not declassified until the 1970s, contained the plan for the stepped-up militarization of the cold war that would be used to justify the Korean and Vietnam wars. NSC 68
became, in the words of one historian, the "most famous unread paper of its era," intensely interpreted in
the Soviet Union as well as the United States. 17 The unseen document was teased out of the events it
engendered, then used to read them. Figuring out what you don't see by what you do see, or almost see,
calculating events by what could, but doesn't, happen, was integral to the early nuclear era's doctrine
of (over)preparedness.
The very name "cold war" raises interpretative questions. As historian Anders Stephanson has pointed
out, it is metaphoric, strangely narrativeless, and atemporal--as the terms "postmodern" and
"postcolonial," for instance, are not. It implies a war that is not really a war, a war that cannot involve open
conflict for those actually participating in it, a war that makes no gesture toward victory or defeat, much
less peace, war's traditional aim, as its outcome and goal. 18 In turn, the term "cold war" spawned a host
of equally oxymoronic phrases like "dual hegemony," "limited nuclear war," "peace-keeping missiles," and
"win the peace." The policy makers who authored NSC 68 described their aim in psychological and
epistemological terms as well as economic and military ones. They wanted, in their words, to convince
"the American people and all free peoples that the cold war is in fact a real war"; as Secretary of State
Dean Acheson put it, it was necessary to "bludgeon the mass mind" with something "clearer than truth." 19
The extreme skepticism about the possibility of disinterested knowledge and language that
postmodernism sponsors may be less universal than it sometimes seems to claim; the postmodern
critique of power makes the most sense when taken as a straightforward description of the extremes of
official dishonesty characteristic of the cold war era. In 1953, in yet another top-secret
document, a group of defense advisers urged Eisenhower to adopt "a national program of deception and
concealment from the American public." 20 The U-2 fiasco of 1960, in which Eisenhower initially told
congressional leaders and the American public that the American reconnaissance aircraft shot down over
the Soviet Union was simply a "weather plane" (BM, 218), was evidence of how thoroughly he carried out
the suggestion. 21
I am not suggesting that World War II or the cold war caused postmodernity or that each enterprise did
not have its own agents and agendas; I tend to believe in something like synchronicity when it comes to
sorting out the relations between base and superstructure, between economics, politics, and culture. 22 I
am suggesting, however, that postmodernity can not be fully explained or understood outside of its cold
war context; they group together as one "composition," as Gertrude Stein might say. 23
Postcoloniality, too, belongs, if less comprehensively and less satisfactorily, in the cold war composition.
The struggle for independence in the Third World was fueled by the devastating costs of World War II for
the imperial powers, and the cold war had everything to do with the United States's often lethal
interventions in Third World affairs. Whether or not the United States is, as Gertrude Stein thought, the
world's oldest modern country, it is the only one of the former so-called Great Powers with a recent
colonial past, and it is certainly the world's oldest neocolonial power. Latin America, theoretically freed
from imperial control by the 1820s, had served as its laboratory. 24
Surveying the controversy over the U.S.'s imperial acquisitions in the Pacific at the turn of the twentieth
century, Senator John Foster observed shrewdly, "Whatever difference of opinion may exist among
American citizens respecting the policy of territorial expansion, all seem to be agreed upon the desirability
of commercial expansion. In fact, it has come to be a necessity to find new and enlarged [overseas]
markets for our . . . products." In 1899 the United States officially adopted the Open Door Policy, an
attack on trade barriers everywhere (except, of course, at home); as Woodrow Wilson put it a few years
later, "the doors of nations that are closed must be battered down!" 25 The United States was adept in the
neocolonial arts of informal empire or free trade imperialism, in other words, long before World War II;
neocolonialism was to be the preferred mode of western imperialism over the next half century, and the
United States had a head start. In the late 1940s, the Marshall Plan gave the U.S. the bargaining chip it
needed to open the markets controlled by Europe, including those of its soon-to-be former colonies. 26
An exercise in geographic and psychological displacement, the cold war was fought in its active or "hot"
phases almost exclusively on Third World terrain, in Korea, Guatemala, the Congo, Vietnam, Angola,
Nicaragua, El Salvador, and elsewhere. Africa and Latin America were pointedly excluded from the
benefits of the Marshall Plan, which, like most U.S. cold war foreign and domestic policy, was a method of
showcasing the white West as a model of success while insuring that its success was fed by the more or
less invisible exploitation of Third World labor. As DuBois remarked, "it comes natural to
us to have great masses of unthought of men, to conceive of society as built upon an unsocial mud sill";
the (former) colonies are "the slums of the world" (DB, 653, 676). 27 Frantz Fanon viewed the Marshall
Plan as a deliberate decoy, a distraction from the needs of the Third World and the reparations the West
clearly owed it. 28
The oil-rich Middle East and the markets of the Far East had long been significant targets for United
States industry, but postwar defense needs, particularly nuclear ones, made parts of sub-Saharan Africa
vital to U.S. "national security"--the phrase was the mantra of the cold war years--as well. 29 Policy
makers knew the U.S. was vulnerable to charges that it valued Africa only for its strategic minerals, and in
fact, the high uranium potential of South Africa and the uranium mines of the Belgian Congo, ranked by
U.S. experts as the richest in the world and the main source of the uranium used in the bombs dropped on Hiroshima and
Nagasaki, go a long way toward explaining U.S. interest in those countries. 30 Finally, the United States
was the main architect, and also the site, of the United Nations, an organization it expected and still
expects to boss, an organization nonetheless vital to Third World self-definition and central to issues of
postcoloniality in the last half century. When the UN was founded in 1945, it had fifty-one member
nations, only six of which were Asian or African; over the next two decades, sixty-five more nations
joined, the vast majority of which were Asian or African peoples newly free of European colonial rule. U.S.
diplomats and policy makers might refer in private to the new UN representatives as "international
juvenile delinquents," responsible for the "psychopathic atmosphere" of the organization, but in public
they had to listen to their voices and count their votes. 31 Unbearably bullied and compromised as the
United Nations has been by the United States, it is still the place that Secretary-General Dag
Hammarskjold claimed in 1960, not for the "big powers," but "for all the others"; it is "their organization,"
he said, adding, "I deeply believe in the wisdom with which they will . . . use it and guide it." 32
I am making large claims for the centrality of the United States in any effort to periodize the twentieth
century. I do so partly out of a predictable desire to extend the only terrain on which I can claim firm
footing, but also because the study of a single nation provides the kind of specific material base one
needs to ground knowledge, a base in this case that forces the scholar irrevocably into the international,
even global, dimension. A nationalism as intense, ambitious, and successful as that of the United States
was almost certain, paradoxically, to undo itself, losing its precise boundaries as its global reach
extended. DuBois said of Belgium in 1921 that the startling size of the Congo, then its colonial
possession, was "destined to make Belgium but a physical fraction of its colonial black self" (DB, 663);
white Belgians, in other words, all powerful as they might seem, were simply a minority in their own
grotesquely distended country. DuBois's observation applies to a neocolonial power like the United States
as well: market conquest and covert infiltration change the conqueror as surely (if less visibly) as do
territorial conquest and open rule. Moreover, as the only Great Power founded and developed as a
modern nation state by more than one race, as the genocidal conqueror and keeper of a
Native American population, as host to millions of immigrants from Eastern Europe and the Third World,
the United States has "colonial subjects" and citizens with strong links to other national cultures within its
own borders. A long-standing habit of what historian Michael Kammen calls "compound identities" is one
of the more viable justifications for the United States's much vaunted exceptional status. 33
Although the United States has always had an international role and identity, these aspects became far
more pressing and visible in the second half of the twentieth century. I feel compelled to add a personal
note. As someone who takes an inordinately long time to write a book, I tend to fall into protracted states
of identification with the figures and the era that I am studying. A friend once told me that I wasn't so
much a scholar as a Method actor researching her next part. Turning to a new period of American cultural
history feels like a change of self as well as a change of subject. Moving from the 1850s to the 1920s, as I
did in my second book, was manageable, if difficult; after all, the moderns defined themselves by their
revolt against the Victorians, the subject of my first book, and their revolt wonderfully expressed my own
dissatisfactions with the cultural solutions of the earlier era. Turning from the modern to the late capitalist,
postmodern, and postcolonial era, however, has involved vast, often painful changes in my mind-set and
performance, in good part because of the United States's newly expanded global role and the nature of
that role. 34
The United States was a superpower in the 1920s as later; indeed, World War I, which devastated the
other Great Powers, initiated America's superpower status. The U.S. consolidated its lead position in the
1920s not only through its ever-accelerating productivity and investment rates, but through its mass arts,
now for the first time exported globally via the new media of radio, records, movies, and advertising. 35 F.
Scott Fitzgerald, chronicler of the Jazz Age, claimed that London tailors began cutting their suits to the
American physique in the 1920s. "We were the most powerful nation," he boasted. "Who could tell us
anymore what was fashionable and what was fun?" 36 The tone, however, of the triumphant upstart, the
self-conscious cockiness of the innocent newcomer evident in Fitzgerald's words, marks the difference
between the international status and ambitions of the post-World War I and the post-World War II United
States; first-time global preeminence is not the same thing as calculated, extended, and entrenched
global control.
American historians William Appleman Williams, Charles Maier, Michael Hogan, and Thomas McCormick
have sharply modified the traditional picture of the United States in the 1920s as isolationist--that decade,
as Williams put it, was not America's "lost weekend in international affairs" (AD, 110). 37 The same
historians, however, have also firmly established the differences between the United States's international
assumptions and activities in the 1920s, and its global role in the 1940s. Thomas McCormick
characterizes the U.S.'s posture in the 1920s and early 1930s as "semi-internationalism" in contrast to the
full-blown internationalism or globalism of the post-World War II years (HC, 248). The United States had
the most rudimentary of overseas intelligence agencies in the 1920s. In 1921, Allen
Dulles, later head of the CIA and then a diplomat stationed in Geneva, intercepted a series of revealing
radio messages between Moscow and Bolsheviks elsewhere. Communist Russia was already viewed as
an enemy regime, one not diplomatically recognized by the United States, but the U.S. government cut off
Dulles's work when he asked for more funding, arguing that the cost of hiring two translators, at $1,500 a
year, was prohibitive. In 1953, the American secret service joined Britain in another line-tapping project,
this time in Vienna, intercepting Russian messages. It required 600 tape recorders operating 1,200 hours
a day using 800 reels of tape, with fifty CIA translators, at a cost of $20 million, but the U.S. paid without
protest. At the height of the cold war, the budget of the U.S. intelligence services topped $30 billion a
year. 38
In the 1920s as in the post-World War II years, private American corporations made massive loans to a
ravaged postwar Europe, particularly to Germany, but the government was slower either to guarantee the
loans or to press the political advantages such investments could bring. As the depression testifies, the
United States in the 1920s had no real master plan, no way of insuring that the tactic of European debt
and American credit would not backfire; it had no strategy by which to implement and sustain its own
creed of economic growth at all costs. The Marshall Plan, of course, was just such a strategy, and it
meant that American business and government, now operating in an uneasy but persistent partnership,
were on the international stage to stay. Again, Europeans in the 1920s and 1930s might go crazy over
African American musicians and performers like Louis Armstrong, Duke Ellington, and Florence Mills as
they toured the continent, but the American government did not yet see them as quasi-official instruments
of Americanization, the advance guard of a global market, as it would beginning in the 1940s. The
Monroe Doctrine might still be in effect in the 1920s, but it had not yet become what one historian of the
post-World War II campaign conducted on behalf of American popular culture abroad has called the
"Marilyn Monroe Doctrine," a doctrine as firmly geared to the ideology of anticommunism and economic
expansion as the formal Truman Doctrine of 1947.
Radio Free Europe's propaganda was less important to this campaign than the music it sponsored; jazz
critic Leonard Feather promised that "Hot Jazz" could "melt Joe [Stalin]'s Iron Curtain" and "turn the cold
war into a cool war overnight." A crowd assembled at the American embassy in Athens one day in 1956
to protest U.S. policies returned the next day to cheer the bebop musician Dizzy Gillespie; whatever
Gillespie's own politics, from the perspective of the U.S. government, he was a pawn in their game. 39
The United States had become, as Noam Chomsky likes to insist, an international gangster. One of the
clinical types most hotly discussed by American psychiatrists in the first two decades of the cold war was
that of the "psychopath," an admittedly imprecise term used to designate a person beyond the reach of
society's values, one who failed to internalize the norms of social behavior. Though asymptomatic in
terms of conventional psychosis or even neurosis, the psychopath was believed to be inaccessible
to conscience, to the learning experience, and to therapy; he apparently met no inner
resistance in the act of uttering and maintaining what the world held to be untruth. 40 The term was
liberally applied by professionals and amateurs in those years to everyone who threatened national
security, ranging from Stalin, Castro, and Lumumba to Billie Holiday, Lester Young, Charlie Parker,
Jackson Pollock, and Jack Kerouac. 41 Cold war-speak, like cold war military activity, was a form of
extreme displacement, language split off from visible reality. 42 To make sense of it, one often needs to
reverse the apparent subject-object designation; it was, in other words, the United States government,
flanked by its multinational corporations and defense intellectuals, who was the psychopath.
Freudian psychoanalysis, the neo-orthodoxy of Reinhold Niebuhr, and the novels of Fyodor Dostoyevsky
(a favorite not only of Richard Wright and Ralph Ellison but also of Whittaker Chambers, the Beats, and
the New York intellectuals clustered around the Partisan Review), all proponents or examples of self-scrutiny and the darkness such scrutiny reveals, were at their peak of popularity in the U.S. in the first two
decades of the cold war. I have come to see this vogue for ideas of sin and self-searching not only as a
means of self-justification--"Since we are all prey to evil impulses, who can fault me?"--but also as a
symptom of a final reluctance among Americans at this time either fully to undertake the task of self-knowledge or to abandon it once and for all. 43
Conscious that their policies were bringing them some strange bedfellows, including ex-Nazi intelligence
agents, and an array of foreign dictators like Fulgencio Batista in Cuba, Sese Seko Mobutu in the Congo,
Hendrik Frensch Verwoerd in South Africa, and Antonio de Oliveira Salazar in Portugal, officials joked
uneasily about having to walk and talk with the Devil to gain their ends. As one CIA man remarked about
recruiting Nazis, "We knew what we were doing. . . . [We were] using any bastard as long as he was anticommunist." The effort, he said, was not to "focus on it." 44 I translate the cultural fascination with the
psychopath as the last, failed stand at sighting an image in the mirror. "I know someone's a psychopath
around here," people seemed to be saying; "it can't be me, so it must be you." The United States had
embarked on what the ever-astute William Burroughs saw as a "suicidal and psychotic" course of
legalized criminality, or to return to the words of Eisenhower's military advisers in 1953, "a national
program of deception and concealment." 45 The doctrine of credibility that led the United States into its ill-fated ventures in Korea, Cuba, and Vietnam is the tactic of the pathological liar; i.e., "You will believe me
only if I maintain my story against all comers, for it can not be established as truth on any other grounds."
As Mark Twain remarked, the problem with lying is that you have to remember everything you say. 46
Several rather unfashionable views of mine are in play here. First, I believe that there is some correlation
between national policies and individual psyches; to pretend otherwise is to further the work of
postmodern obfuscation. Even when their exact contours are purposefully camouflaged or filed away as
classified information, such policies sooner or later affect every aspect of that society's cultural activity. If
one remembers that the U.S. government stepped up surveillance of its citizens to
unprecedented levels in the 1940s and early 1950s; that for the first time, it compiled psychological
dossiers on everyone inducted into its military forces (sometimes sharing the information with the ever
expanding FBI); that federal housing agencies were making maps of every neighborhood in the United
States, ranking each according to its racial/ethnic homogeneity, social stability, and earning potential, and
granting federal funds accordingly; that the nation was tightening its drug laws and defining a host of
beliefs and activities, most notably communism and homosexuality, as criminal, even treasonable--with all
this in mind, my view of the influence that government policy had in private lives in this era may seem
more plausible. 47 The cold war administration had decided the personal was the political long before
postmodernists made the discovery.
Second, I actually believe that most people have something like a truth instinct, for lack of a better
phrase, a steering and self-defining device that goes far deeper than conscience or convention. This
motive force is the hope of establishing an accurate and meaningful narrative, an authentic form for the
self and the world it inhabits. Its representation and dynamic are, of course, culturally inflected; the modes
change over time, but increased violations of this truth instinct, whether by individual behavior or national
policy, are felt. 48 As policy, such violations influence all cultural life. In his last years, strung out on drugs
and exhaustion, Neal Cassady, the catalyst first of the Beat Generation and then of Ken Kesey's Merry
Pranksters, sometimes believed he was in a conversation with the Devil. His two favorite lines from the
Bible, a book he knew backwards and forwards, were "God is not mocked" and "As ye sow, so shall ye
reap." 49
When I think of the 1920s, along with the witty, cynical one-liners like "a sucker is born every minute" and
"the first hundred years are the hardest," come lines from Hart Crane, speaking of "the lamb's first
morning," of "a sail flung into April's inmost day"; from Langston Hughes, writing of "tomorrow / Bright
before us / Like a flame." 50 Lines of equal expressive intensity, if of a different emotional valence, come
to my mind from the cold war era as well--Jack Kerouac's pledge to deliver "telepathic shock and
meaning-excitement," his "hope is a word like a snowdrift"; or Robert Hayden's "voyage through death /
To life upon these shores," and his phrase "love's austere and lonely offices." 51 I am also haunted by a
very different line: the message left by Dr. Sidney Gottlieb, a CIA expert in medical warfare who was sent
to the Congo in 1960 to deliver a lethal toxin intended for use on Patrice Lumumba (who was code named
"Stinky") to another CIA operative there--"the virus [is] in the safe." 52 With its lurid, metaphoric hints of
scientifically planned biological contagion and paranoid lock-up, it could not belong to any era but that of
the cold war. There is palpable farce in the line--Can this be real?--but terror as well, because in fact this
was real.
I was born in the 1940s and brought up in the 1950s; this is my era, and the book I am writing about it has
long seemed to me the most exciting of the several projects in which my life has been happily consumed,
the one closest to my own needs and preoccupations. Yet I do not find in the culture of
the times, brilliant as it often was, the varied rhythms, the freedom or ease of thought so marked, to my
mind, in the American moderns. The artists of the cold war seem to face, if not higher, more
overdetermined odds. The room for expression has shrunk, even as the loss of space intensifies the
performance. Dotson Rader, memorializing his friend Tennessee Williams, one of the greatest talents of
the era, was drawn to these lines by the Russian poet Andrei Voznesensky: "Life is a series of burnt out
sites. / Nobody escapes the bonfire: / If you live, you burn." 53
The great metropolises that sponsored American modernism, New York in particular, but Chicago and
Los Angeles as well, also hosted the most dazzling cultural achievements of the post-World War II
decades. Film noir, the Actor's Studio, Abstract Expressionism, snapshot photography, confessional
poetry, the Beat movement, bop, rhythm and blues, and the black protest narrative tradition were in good
part urban movements and products. The city was being abandoned in these years, however; by
manufacturers, big corporations, federal funding, and the white middle classes fleeing by the millions to
the new suburbs, themselves heavily subsidized by the federal government. Americans' long-standing
war against their own cities reached new intensities in those decades, and the fact of nuclear warfare
gave this war added ammunition. Eisenhower, explaining the rationale for the vast network of
superhighways codifying American travel that were to come out of the Interstate Highway Act of 1956,
said that "In case of atomic attack on the cities, the road net must permit quick evacuation of target
areas." 54 World War II had established the city as the logical object of bomber raids intended to
demoralize and destroy civilians as well as defense factories, and commentators began warning
Americans shortly after Hiroshima and Nagasaki that an atom bomb could "wipe out [a city] in thirty
minutes. . . . New York will be a slag heap. . . . Radioactive energy . . . will leave the land [around it]
uninhabitable" for up to five hundred years (BB, 14). Ghastly pictures were drawn and horrifying tales told
of New York and other cities in ruins. 55
Perhaps anticipating such devastation, during his 1946 Manhattan visit, Jean-Paul Sartre called it "The
Great American Desert." 56 Camus, visiting the same year, saw it as a "prodigious funeral pyre at
midnight"; "everybody looks like they stepped out of a B-film," he remarked. 57 At the start of Laura
(1944), an early film noir directed by German émigré Otto Preminger, the villain and narrator of the film, a
Machiavellian gossip columnist called Waldo Lydecker (played by Clifton Webb), says, "It seemed as if I
were the only person left alive in the city." 58 Of course, New York had not been deserted by Puerto
Ricans or blacks or artists of whatever ethnic background. Sartre, Camus, and Waldo Lydecker were
nonetheless expressing a larger point. The city, as it appeared in American film and fiction of the 1920s
and 1930s, whatever its terrors, was a site of modernity, the place where the new was happening. The
city of post-World War II fiction and film, however, is a place that is changing far less rapidly than the
suburban landscapes developing all around it; it has become a museum, a site of nostalgia as well as
innovation. Always a place of mystery and a cue to fearful fantasies among white middle-class
Americans, the city was now enlarging its population of "unthought of men," in DuBois's
phrase, those people whose names for themselves and their metropolis had little publicity or currency in
the mainstream media culture. The city was changing hands without changing names. The language
disorder that afflicts late capitalist postmodern culture had a point of origin here.
So I return to the periodization of the second half of the twentieth century, the place where I began.
Postcoloniality and postmodernity originated in a common site, in the events and developments of World
War II and of the cold war. In the last two decades, they have produced the academic discourses of
postcolonialism and postmodernism. 59 Postmodernism seems to be in part a denial of postcoloniality,
much as the Marshall Plan bypassed the Third World to showcase the First. Put another way,
postcolonialism has the power to explain and contextualize postmodernist thought, as postmodernism
does not seem fully able to situate or deal with postcolonialism. It can do so for much the same reasons
that during the cold war years the Marxists of the Communist world were often shrewder in their
assessment of the capitalist West than vice versa; for the same reasons that intellectuals of color over the
last fifty years have been more insightful about our era than white western intellectuals. As James
Baldwin put it, "we who have been described so often are now describing [you]." 60 I believe that the
forces of change, if there is to be change, do not lie primarily with the traditional elites of the First World.
I see the half century since World War II as divided into two stages, the first lasting from roughly 1945
through the early 1960s, the second running from the mid-1960s until the present. The latter is the period
Anthony Appiah has labelled "post-optimism," when, in Masao Miyoshi's words, "the return to 'authenticity'
is a closed route," and, to quote Edward Said, "innocence is . . . out of the question." 61 The first
generation of post-World War II artists in both the First and the Third Worlds, faced with the psychotic
behavior and elaborately systematic deceit of the cold war era, were nerved to fresh acts of resistance
and self-expression. These desperately creative acts of heroic subjectivity were attempts at what Jack
Kerouac called "100 percent personal honesty," a romantic reinvention of charisma designed to declassify
every kind of information for revolutionary political and artistic ends. 62 In this earlier period, art and
thought still seem grounded in specific geographical places and historical times; new forms, political and
artistic, are believed to be possible. There is an outside to the system, a place where protest is
meaningful and consequential. As the 1960s and 1970s progressed, however, colonialism gave way not
to independence, but to neocolonialism, and many of the charismatic leaders and intellectuals of the
1950s and 1960s were systematically assassinated, deposed, or died young, sometimes in mysterious
circumstances. I think of Felix Moumié, Patrice Lumumba, Amilcar Cabral, Kwame Nkrumah, Richard
Wright, Frantz Fanon, Che Guevara, Jacobo Arbenz Guzmán, Malcolm X, Martin Luther King, and even
John and Robert Kennedy.
It is hard to imagine or to overestimate the effect of such losses. Independence in Africa is "represented
by certain men," Fanon wrote in his impassioned elegy for his friend Lumumba. 63 We should not be too
quick to deconstruct or disbelieve his statement. By definition, charisma in its full
Weberian sense is unexpected, tied to particular personalities, something you believe in only when you
see it. To wipe out a generation of charismatic leaders is to destroy a belief system as well, one intimately
connected to the possibility of deliverance, of change. Nor would I wish to underestimate the impact of the
untimely deaths and early dead ends, no matter how self-destructive, of cultural megastars and
innovators like Charlie Parker and Jackson Pollock, or even of lesser figures like James Dean,
Montgomery Clift, Dorothy Dandridge, Jack Kerouac, Marilyn Monroe, Hank Williams, Sam Cooke, and
Elvis Presley. At its best, American popular culture, as C. L. R. James understood, can be a promissory
note to fresh transformations. 64 We are just beginning to count the casualties of the cold war.
To me, the second, "post-optimism" stage of the culture of late capitalism is the less interesting one.
Despite its achievements, the feminist movement chief among them (the early cold war era was a time of
masculine revolt), I see it as the time when many of us lost our way. 65 At its worst, this stage was little
better than a quagmire of intellectual and political compromise, of overinvestment in pastiche and irony, a
cultural moment ashamed of its hopes and defaulting on its dreams. I don't despair, however. The work of
the major intellectuals of the 1940s and 1950s, of DuBois, Wright, Fanon, Mills, Muste, O'Dell, and
James, among others, all of whom wrote in a state of creative tension with various traditions of Marxist
thought, goes a long way toward lighting the path to the rigorous analysis of our times we need and the
praxis it might engender. I have to believe this.
Ann Douglas is Parr Professor of Comparative Literature at Columbia University. She is the author of
Terrible Honesty: Mongrel Manhattan in the 1920s (1995), and is currently at work on a study of cold war
culture titled If You Live, You Burn.
Notes
1. See Fredric Jameson, Postmodernism, or, The Cultural Logic of Late Capitalism (Durham, N.C.: Duke
University Press, 1991), 3.
2. See my The Feminization of American Culture (New York: Knopf, 1977) and idem, Terrible Honesty:
Mongrel Manhattan in the 1920s (1995; New York: Noonday, 1996).
3. For the term "language disorder," see Fredric Jameson, "Postmodernism and Consumer Society," in
The Anti-Aesthetic: Essays on Postmodern Culture, ed. Hal Foster (Port Townsend, Wash.: Bay Press,
1983), 118. See also my "High Is Low," New York Times Magazine, 29 September 1996, 75-80, for
further discussion of what Richard Wright described as the West's "genius for calling things by [the] wrong
names," a disorder that peaked in the cold war era; (Richard Wright, White Man, Listen! [1957; New York:
Harper Collins, 1995], 57). The colonial period, with its countless undeclared small wars and its talk of
pacification and treaties of protection as a cover for imperial aggression, was the major source and
predecessor of the grotesque euphemisms and distracting abstractions that constitute the language
disorders of the cold war, itself a continuation of colonialism by changed means; see William Pietz, "The
Post-Colonialism of Cold War Discourse," Social Text 19/20 (fall 1988): 55-75. The founders of the CIA
were Anglophile devotees of the British empire, raised on the "great game" of Rudyard Kipling's Kim.
British statesmen referred to the empire's colonies as "the jewels in the crown"; the CIA alluded to its
agents as "the crown jewels" and its assassination plots as "the family jewels." See Peter Grose,
Gentleman Spy: The Life of Allen Dulles (Boston: Houghton Mifflin, 1994), 87, 148; hereafter abbreviated
GS; and Evan Thomas, The Very Best Men: Four Who Dared: The Early Years of the CIA (New York:
Simon and Schuster, 1994), 231; hereafter abbreviated BM. Moreover, if the cold war is recognized for
what it was, as a continuation under a different name of the centuries-long drive for global control on the
part of the Great Powers, in this case the U.S., it is not yet over. The difference in our "post"-cold war era
is that there is momentarily no covering rationale or tag like "imperialism" or the "cold war" for capitalism's
ongoing drive for world hegemony. The U.S. still gestures toward the cold war slogan of "national
security" to cover its actions, as in its recent attempts to force its partners in the World Trade Organization
into conformity with its ugly economic reprisal against Cuba, but the term seems increasingly empty of
content. I find it fascinating that a number of key cold war terms, like "intervention," have been
appropriated without comment by today's literary theorists.
For the necessity of the terms "postmodern" and "postcolonial," see Jameson, Postmodernism, xxii;
Andreas Huyssen, After the Great Divide: Modernism, Mass Culture, Postmodernism (Bloomington:
Indiana University Press, 1986), 181; and Stuart Hall, "When Was the 'Post-Colonial'? Thinking at the
Limit," in The Post-Colonial Question: Common Skies, Divided Horizons, ed. Iain Chambers and Lidia
Curti (New York: Routledge, 1996), 242-60. For further discussion of the "posts" and their American
origins in New York intellectual circles, see my "The Failure of the New York Intellectuals," Raritan
Review 17, no. 4 (spring 1998): 1-23.
4. I discuss the 1920s commodification of time in my Terrible Honesty, 481-85.
5. See Eric Foner, Reconstruction: America's Unfinished Revolution, 1863-1877 (1988; New York: Harper
and Row, 1989); Ronald Takaki, Strangers from a Different Shore: A History of Asian Americans (1989;
New York: Penguin, 1990); and Virginia E. Sánchez Korrol, From Colonia to Community: The History of
Puerto Ricans in New York City, rev. ed. (Berkeley: University of California Press, 1994).
6. W. E. B. DuBois, The Souls of Black Folk (1903) in Three Negro Classics, ed. John Hope Franklin
(New York: Avon, 1965), 209. For DuBois's later terminology, see W. E. B. DuBois: A Reader, ed. David
Levering Lewis (New York: Henry Holt, 1995), 83-95, 642-92, 755-800; hereafter abbreviated DB. My
chief source for late capitalism is Ernest Mandel, Late Capitalism, trans. Joris De Bres (New York: Verso,
1978).
7. See Joyce Appleby, Lynn Hunt, and Margaret Jacob, Telling the Truth About History (New York: W. W.
Norton, 1994); Melvyn P. Leffler, "New Approaches, Old Interpretations, and Prospective
Reconfigurations," Diplomatic History 19 (spring 1995): 173-98; Masao Miyoshi, "A Borderless World:
From Colonialism to Transnationalism and the Decline of the Nation State," Critical Inquiry 19 (summer
1993): 726-51; and Edward Said, Culture and Imperialism (New York: Knopf, 1993), 303. Anders
Stephanson is one of the younger historians who combines literary deconstructionist and postmodern
notions of textuality and the historical subject with solid archival research. He writes that his literary
colleagues seem to lead "a comparatively charmed life," since, unlike historians, they do not "spend a lot
of time finding . . . sources," nor feel an obligation "to write something readable" (Anders Stephanson,
Kennan and the Art of Foreign Policy [Cambridge, Mass.: Harvard University Press, 1989], viii, ix). I am
immeasurably indebted to Stephanson, a Columbia colleague, who was kind enough to guide my
research and critique this essay, though its shortcomings are, of course, my own.
For a defense of historical narrative and in-depth research, see Natalie Zemon Davis, "Who Owns
History?" Perspectives: AHA Newsletter 34 (November 1996): 1, 4-6; for both a moving essay on the
historian's ethos and an implicit critique of postmodernism, see Caroline Bynum, "Wonder," American
Historical Review, 102 (February-March, 1997): 1-26. The most important historicist-minded attacks on
postcolonialism and postmodernism are, respectively, Arif Dirlik, "The Post-Colonial Aura: Third World
Criticism in the Age of Global Capitalism," Critical Inquiry 20 (winter 1994): 328-56; and David Harvey,
The Condition of Postmodernity: An Inquiry into the Origins of Cultural Change (Cambridge, Mass.:
Blackwell, 1990). See also Ella Shohat, "Notes on the Post-Colonial," Social Text 10 (summer 1992): 99-113; The Postmodern History Reader, ed. Keith Jenkins (New York: Routledge, 1997), an anthology of
critiques on postmodernism by historians; and Lutz Niethammer, Posthistoire: Has History Come to an
End? trans. Patrick Camiller (New York: Verso, 1992), a broader analysis of the various end-of-history
schools whose thinking underlies the prefix mania.
8. For an excellent summary of the White-LaCapra critique of historians, see Lloyd S. Kramer, "Literature,
Criticism, and Historical Imagination: The Literary Challenge of Hayden White and Dominick LaCapra,"
in The New Cultural History, ed. Lynn Hunt (Berkeley: University of California Press, 1989), 97-128.
Historians have themselves critiqued and historicized the assumptions of objectivity that governed earlier
historical research; see Peter Novick, That Noble Dream: The 'Objectivity Question' and the American
Historical Profession (New York: Cambridge University Press, 1988). For an historical account of
postmodernity whose critical stance mirrors postmodern theory, see Michael Geyer and Charles Bright,
"World History in a Global Age," American Historical Review 100 (October 1995): 1034-60.
9. For examples of cultural historians attempting literary methods of reading (in this case, of film) but
falling short, see the otherwise excellent essays by Michael Rogin, "Kiss Me Deadly: Communism,
Motherhood, and Cold War Movies," Representations 6 (spring 1984): 1-36; Emily S. Rosenberg,
"'Foreign Affairs' After World War II: Connecting Sexual and International Politics," Diplomatic History 18
(winter 1994): 59-70; and Warren Susman, "Did Success Spoil the United States? Dual Representations
in Postwar America," in Recasting America: Culture and Politics in the Age of Cold War, ed. Lary May
(Chicago: University of Chicago Press, 1989), 19-37.
10. See Nicholas D. Kristof, "Malaria Makes a Comeback," New York Times, 1 January 1997, 1, A10 (with
a global map of malaria's empire that is strikingly coincidental with the traditional boundaries of the Third
World); and "Technology for Everyman," New York Times, 20 January 1997, D6 (with a global map of
regular Internet users that largely conforms to the usual boundaries of the First World). I take Internet use
as an index of postmodernity; see Sherry Turkle, Life on the Screen: Identity in the Age of the Internet
(New York: Simon and Schuster, 1996), which makes and documents the same assumption.
11. The best histories of the cold war are Walter LaFeber, America, Russia, and the Cold War, 1945-1992, 7th ed. (New York: McGraw-Hill, 1993); Melvyn P. Leffler, A Preponderance of Power: National
Security, The Truman Administration, and the Cold War (Stanford, Calif.: Stanford University Press,
1992); Thomas J. McCormick, America's Half-Century: United States Foreign Policy in the Cold War
(Baltimore: The Johns Hopkins University Press, 1989); hereafter abbreviated HC; and Martin Walker,
The Cold War: A History (New York: Henry Holt, 1993). Not always factually reliable but indispensable is
Noam Chomsky, "The Cold War Reconsidered," in World Orders Old and New (New York: Columbia
University Press, 1996), 26-74. See also Jean Baudrillard, "The Precession of Simulacra," in Simulacra
and Simulation, trans. Sheila Faria Glaser (Ann Arbor: University of Michigan Press, 1994), 32-40, for an
impressionistic, brilliant discussion of the meaning for contemporary culture of cold war strategies like
deterrence. For an analysis of current cold war historiography, see Anders Stephanson, "The United
States," in The Origins of the Cold War in Europe: International Perspectives, ed. David Reynolds (New
Haven: Yale University Press, 1994), 23-52. Stephanson stresses that the early cold war was defined by
the U.S.'s decision not to negotiate with the Soviet Union. Russia (and Stalin) were seen as too
deceptive, barbaric, and monstrous, off the scale of nations and humanity, so to speak, to permit
diplomatic exchange. By cold war logic, the unknowable can only be contained, never penetrated, an
attitude that was itself a precursor to postmodernism proper. See also Howard Jones and Randall P.
Woods, "Origins of the Cold War in Europe and the Near East: Recent Historiography and the National
Security Imperative," Diplomatic History 17 (spring 1993): 251-76, for another overview of scholarship.
12. For a summary of post-cold war historiography about the former Soviet Union, see Steven Merritt
Miner, "Revelations, Secrets, Gossip, and Lies: Sifting Warily through the Soviet Archives," New York
Times Book Review, 14 May 1995, 19-21. For the best of the American self-congratulations by a
distinguished historian, see John Lewis Gaddis, We Now Know: Rethinking Cold War History (New York:
Oxford University Press, 1997). Such self-congratulation is possible only when the record of the United
States's activities in the Third World, and the Third World itself, are ignored, as Gaddis pointedly does in
The Long Peace: Inquiries Into the History of the Cold War (New York: Oxford University Press, 1987),
and "Intelligence, Espionage, and Cold War Origins," Diplomatic History 13 (winter 1989): 191-212,
despite the concentration of CIA forces there. For a corrective view of the cold war and the Third World,
see Gabriel Kolko, Confronting the Third World: United States Foreign Policy, 1945-1980 (New York:
Pantheon, 1988); Brenda Gayle Plummer, Rising Wind: Black Americans and U.S. Foreign Affairs, 1935-1960 (Chapel Hill: University of North Carolina Press, 1996); Immanuel Wallerstein, "African Unity
Reassessed," Africa Report 11 (April 1966): 41-46; and idem, "Africa, the United States, and the World
Economy," in U.S. Policy Toward Africa, ed. Frederick S. Arkhurst (New York: Praeger, 1975), 11-37.
13. For an exhaustively documented analysis of the shift among U.S. policy makers toward cold war aims
in 1945, see Melvyn P. Leffler, "The American Conception of National Security and the Beginnings of the
Cold War, 1945-48," American Historical Review 89 (April 1984): 346-81, 391-400; see also HC, 33-42.
The Marshall Plan of 1948, by which Europe was to be restored to industrial prosperity largely at the
Soviet Union's expense, was first sketched out by the U.S. ambassador to the USSR, Averell Harriman, in
early 1945, when that nation was still very much a wartime ally; see Walter Isaacson and Evan Thomas,
The Wise Men: Six Friends and the World They Made: Acheson, Bohlen, Harriman, Kennan, Lovett,
McCloy (New York: Simon and Schuster, 1986), 248. The FBI security checks run on scientist J. Robert
Oppenheimer, working on the atomic bomb at Los Alamos in 1942 and 1943, were directed towards his
possible Russian contacts; see Peter Goodchild, J. Robert Oppenheimer: Shatterer of Worlds (1980; New
York: Fromm International, 1985), 86f. American observers like A. J. Muste and Raymond Gram Swing
saw that the atomic bombs dropped on Hiroshima and Nagasaki were, as Swing put it, "in effect dropped
on the Russians," not the Japanese, as a way of insuring the U.S.'s supremacy in the postwar settlement
and in postwar Asia in particular (Raymond Gram Swing, quoted in Paul Boyer, By the Bomb's Early
Light: American Thought and Culture at the Dawn of the Atomic Age [1985; Chapel Hill: University of
North Carolina Press, 1994], 192; hereafter abbreviated BB). See also A. J. Muste, Not by Might:
Christianity, The Way to Human Decency (New York: Harper and Row, 1947). One might also note that
the naming problems of the late capitalist, cold war era began in World War II; President Roosevelt
solicited names more individualized and colorful than "World War II," but none were forthcoming; thus the
most lethal and extensive war of the twentieth century became by name a kind of clone of its
predecessor, a preparation for the "post" mentality.
For the rise of multinationals, see Mira Wilkins, The Maturing of Multinational Enterprise (Cambridge,
Mass.: Harvard University Press, 1974), and David C. Korten, When Corporations Rule the World (West
Hartford, Conn.: Kumarian Press, 1996). C. Wright Mills, The Power Elite (New York: Oxford University
Press, 1956) is still a seminal study of the rise of the U.S. corporation and the "interlocking directorates"
(8) of its ruling class. For nuclear weapons and for television, see Margot A. Henriksen, Dr. Strangelove's
America: Society and Culture in the Atomic Age (Berkeley: University of California Press, 1997); Joyce
Nelson, The Perfect Machine: Television and the Bomb (Philadelphia: New Society Publishers, 1992);
Richard Rhodes, The Making of the Atomic Bomb (New York: Simon and Schuster, 1986); and Raymond
Williams, Television: Technology and Cultural Form (New York: Schocken Books, 1975). For the
development and meaning of computers, see Martin Campbell-Kelly and William Aspray, Computer: A
History of the Information Machine (New York: Basic Books, 1996); and Computers in the Human
Context, ed. Tom Forester (Cambridge, Mass.: MIT Press, 1989), especially the essays by Langdon
Winner, Stephen S. Cohen, and John Zysman.
14. The Spanish double agent Garbo, a spy for the Allies, invented fourteen fictional agents and eleven
well-placed contacts, all of whom had their own detailed biographies and corresponded with his putative
German bosses in their own literary style; in 1944 he received both the Order of the British Empire and
the Iron Cross from Germany. Garbo is a particularly Pynchonesque instance; see Francis Russell, The
Secret War (Alexandria, Va.: Time-Life, 1987), 28-29. See ibid., 195-97, for the story of the fake Operation
Lifeboat, a feint complete with dummy photos and weapons to distract Hitler from the actual Operation
Overlord that liberated Europe in 1944. For breaking "Purple," see Ronald Lewin, The American Magic:
Codes, Ciphers, and the Defeat of Japan (New York: Farrar, Straus, and Giroux, 1982).
15. For the history of the CIA, see Christopher Andrew, For the President's Eyes Only: Secret Intelligence
and the American Presidency from Washington to Bush (New York: Harper Collins, 1995); Rhodri
Jeffreys-Jones, The CIA and American Democracy (New Haven: Yale University Press, 1989); Grose,
Gentleman Spy; and Thomas, The Very Best Men. For a general, multinational history of espionage,
technological and human (and the growing preponderance of the former), see Jeffrey T. Richelson, A
Century of Spies: Intelligence in the Twentieth Century (New York: Oxford University Press, 1995). My
point throughout this essay is not that the United States was unique in the lengths to which it carried such
foreign espionage and domestic surveillance, though it undeniably had the most resources and money to
devote to them in the post-World War II years. Plausible deniability in one form or another has been part
of politics for as long as there have been ruling classes. Stalin's regime, of course, also destroyed
evidence and had its own language of concealment; Soviet initiatives were routinely described as
"responses to aggression" (Edvard Radzinsky, Stalin, trans. H. T. Willetts [New York: Doubleday, 1996],
446, see also 103, 322, 354, 465, and 532); see also Christopher Andrew and Oleg Gordievsky, KGB:
The Inside Story (New York: Harper Collins, 1990). The European powers, too, conducted extensive
espionage operations; see Douglas Porch, The French Secret Service (New York: Farrar, Straus and
Giroux, 1995); and Christopher Andrew, Her Majesty's Secret Service (New York: Penguin, 1987).
My point is rather that this was a qualitative intensification of tactics, if not a departure, for the United
States, an introjection of enemy methods, one never acknowledged, and highly influential. For CIA/U.S.
ties to Nazi Germany, see Joyce Nelson, The Perfect Machine; Grose, The Gentleman Spy (the New
York law firm of Allen and John Foster Dulles specialized in German business clients); Thomas, The Very
Best Men; Christopher Simpson, The Splendid Blonde Beast: Money, Law, and Genocide in the
Twentieth Century (Monroe, Maine: Common Courage Press, 1995); and Martin A. Lee, The Beast
Reawakens (Boston: Little, Brown, 1997). The fascination among U.S. policy makers with the lessons
Hitler had to teach cold war America is too extensive to document here, but see Dwight D. Eisenhower
uneasily speculating that "we could lick the whole world if we were willing to adopt the system of Adolph
Hitler" (quoted in H. W. Brands, "The Age of Vulnerability," American Historical Review 94 [October
1989]: 970). The Catholic moralist William V. O'Brien, commenting in 1967 on the tragic effect of World
War II on the United States, wrote, "the most demonic success of Hitler was his ability to Hitlerize his
enemies" (BB, 215).
16. For the Castro and Lumumba plots, see BM, 170f, 220-35, 386-88, and 396-98; Richard D. Mahoney,
JFK: Ordeal in Africa (New York: Oxford University Press, 1983), 59-72; hereafter abbreviated JFK.
Seymour Hersh, The Dark Side of Camelot (Boston: Little, Brown, 1997), 188f., makes a persuasive case
that Kennedy knew about them early and supported them strongly.
17. Gaddis Smith, quoted in American Cold War Strategy: Interpreting NSC 68, ed. Ernest R. May (New
York: St. Martin's Press, 1993), 15. For speculation and response to the document in the USSR, see
Vladislav Zubok, "Zubok's Commentary," in ibid., 192-93. It is suggestive that May urges the historian to
read and re-read NSC 68, "much as a literary critic goes back to read a novel or a poem" (17). The shift in
the nature and availability of information is a precondition for the kind of language disorder I am
describing; plausible deniability mandates linguistic obfuscation. See Tobin Siebers, Cold War Criticism
and the Politics of Skepticism (New York: Oxford University Press, 1993), for a suggestive analysis of
New Criticism and its successors, deconstruction and postmodern theory, as an extension of cold war
ideology and tactics. Alan Nadel, Containment Culture: American Narratives, Postmodernism, and the
Atomic Age (Durham, N.C.: Duke University Press, 1995), is also useful in this context. For an influential
theorization of the shifting status of knowledge in the cold war era, see Jean-François Lyotard, The
Postmodern Condition: A Report on Knowledge, trans. Geoff Bennington and Brian Massumi
(Minneapolis: University of Minnesota Press, 1984).
18. See Anders Stephanson, "Fourteen Notes on the Very Concept of the Cold War."
19. American Cold War Strategy, 81; Dean Acheson, quoted in Chomsky, "The Cold War Reconsidered,"
27.
20. Brands, "The Age of Vulnerability," 968.
21. For the U-2 incident, see Michael Beschloss, Mayday: Eisenhower, Khrushchev, and the U-2 Affair
(New York: Harper and Row, 1986).
22. For a seminal discussion of these relations that utilizes Antonio Gramsci's theory of hegemony to
foreground cultural activity while retaining a stress on economic factors and class as in some sense
determinative, see Raymond Williams, "Base and Superstructure in Marxist Cultural Theory," New Left
Review 82 (October-December 1973): 3-16.
23. See Gertrude Stein, "Composition as Explanation," in Selected Writings of Gertrude Stein, ed. Carl
Van Vechten (1946; New York: Vintage Books, 1972), 511-523.
24. I think the U.S. must be seen in a limited but real sense as postcolonial; see Bill Ashcroft, Gareth
Griffiths, and Helen Tiffin, The Empire Writes Back: Theory and Practice in Post-Colonial Literatures (New
York: Routledge, 1989), esp. 1, 2, and 160-65; and Stuart Hall, "When Was the Post-Colonial?" 246.
African Americans, as Richard Wright long ago claimed, are indeed colonial subjects; see Conversations
with Richard Wright, ed. Kenneth Kinnamon and Michael Fabre (Jackson: University Press of Mississippi,
1993), 125. See also J. H. O'Dell, "Colonialism and the Negro American Experience," Freedomways 6
(fall 1966): 296-308; and Gayatri Chakravorty Spivak, "Teaching for the Times," The Decolonization of
Imagination, ed. Jan Nederveen Pieterse and Bhikhu Parekh (London: Zed Books, 1995), 187-89. The
U.S.'s far more influential neocolonial rule has rightly overshadowed, even obliterated, considerations of
its postcolonial status; see Ella Shohat, "Notes on the Post-Colonial"; and Anne McClintock, "The Angel
of Progress: Pitfalls of the Term 'Post-colonial,'" in Colonial Discourse and Post-Colonial Theory: A
Reader, ed. Patrick Williams and Laura Chrisman (New York: Columbia University Press, 1994), 300-302.
Jean Genet, a partisan of the Black Panthers and the Palestinians, and himself an early major
postcolonial theorist, told Le Monde in 1970 that his art was "[no longer] gratuitous. Today it is in the
service of a cause. It is against America" (Jean Genet, quoted in Edmund White, Genet: A Biography
[New York: Knopf, 1993], 540). For an overview of Latin America, see Eduardo Galeano, Open Veins of
Latin America: Five Centuries of the Pillage of a Continent, trans. Cedric Belfrage (New York: Monthly
Review Press, 1973). U.S. intervention has been most visible in Central America; see Walter LaFeber,
Inevitable Revolutions: The United States in Central America, rev. ed. (New York: Norton, 1993).
25. John Foster and Woodrow Wilson, quoted in William Appleman Williams, The Tragedy of American
Diplomacy, 3d ed. (New York: W. W. Norton, 1972), 51, 72; hereafter abbreviated AD; this is still the
single most important book on twentieth-century U.S. foreign policy and its economic motivation; it is the
work of a tormented prophet as well as a great historian.
26. On the Marshall Plan, see Michael J. Hogan, "American Marshall Planners and the Search for a
European Neocapitalism," American Historical Review 90 (February 1985): 44-72, and idem, The
Marshall Plan: America, Britain, and the Reconstruction of Western Europe, 1947-1952 (New York:
Cambridge University Press, 1987). Charles S. Maier, "The Politics of Productivity: Foundations of
American International Economic Policy After World War Two," International Organization 31 (1977): 607-33,
sees the Marshall Plan as a specific answer to and denial of Marxist economic doctrine.
27. For a discussion of U.S. expertise in showcasing, see J. H. O'Dell, "Foundations of Racism in
American Life," Freedomways 4 (fall 1964): 518-35; hereafter abbreviated "FR." The Point Four Policy
officially announced U.S. aid to the Third World in 1949, but it was always the stepchild of the Marshall
Plan enterprise, used largely to wring further rights for the U.S. to Third World resources, markets, and
labor; see Kolko, Confronting the Third World, 42-43.
28. See Frantz Fanon, The Wretched of the Earth, trans. Constance Farrington (New York: Grove, 1968),
80-82, 95-106.
29. For a perceptive discussion of the history of the phrase "national security," see Emily S. Rosenberg,
"The Cold War and the Discourse of National Security," Diplomatic History 17 (spring 1993): 277-84.
30. See Thomas Borstelmann, Apartheid's Reluctant Uncle: The United States and Southern Africa in the
Early Cold War (New York: Oxford University Press, 1993), 180; hereafter abbreviated AR; and David N.
Gibbs, The Political Economy of Third World Intervention: Mines, Money, and U.S. Policy in the Congo
Crisis (Chicago: University of Chicago Press, 1991).
31. Secretary of State Dean Acheson and Assistant Secretary of State for UN Affairs, John Hickerson,
respectively, quoted in Thomas J. Noer, Cold War and Black Liberation: The United States and White
Rule in Africa, 1948-1968 (Columbia: University of Missouri Press, 1985), 229, and AR, 173. For the
history of the UN, see Stanley Meisler, United Nations: The First Fifty Years (New York: Atlantic Monthly
Press, 1995).
32. Dag Hammarskjold, quoted in Mahoney, JFK, 96; see also Brian Urquhart, Hammarskjold (New York:
W. W. Norton, 1972).
33. Michael Kammen, "The Problem of American Exceptionalism: A Reconsideration," American
Quarterly 45 (March 1993): 30; see also Cornel West, "The Postmodern Crisis of the Black Intellectuals,"
in Cultural Studies, ed. Lawrence Grossberg, Cary Nelson, and Paula Treichler (New York: Routledge,
1992), 689-705, for ideas on the distinctiveness of American culture. For histories of Native Americans
and immigrants, see Francis Jennings, The Founders of America (New York: Norton, 1993); John Bodnar,
The Transplanted: A History of Immigrants in Urban America (Bloomington: Indiana University Press, 1985); and
Ronald Takaki, A Different Mirror: A History of Multicultural America (Boston: Little, Brown, 1993).
34. The cultural continuities between the 1920s and the post-World War II decades, between, in other
words, the modern, the postmodern, and even the postcolonial styles and eras in American cultural
history, are easily discernible. Postmodernism, that largely American dialogue with modernity, not
surprisingly has its clearest antecedents in American rather than European modernism. At least initially,
modern American artists like Langston Hughes, James Weldon Johnson, Edna St. Vincent Millay, and
Dorothy Parker were practitioners, or victims, of what critics of postmodernism today call culturalism, a
failure of political precision and praxis, an overevaluation of the importance of cultural as opposed to
economic or political factors as determinants of political change. Like their postmodern heirs, moderns
like Parker, Ethel Waters, Countee Cullen, Ernest Hemingway, and Wallace Thurman were resolute
antisentimentalists and anti-essentialists, firm believers in the performative, constructed self, and experts
at deconstructing through multiple tactics all claims to the contrary. The media were as central to Fats
Waller's or F. Scott Fitzgerald's enterprises as to that of today's postmodernists, and they transgressed
the then heavily policed boundaries between high and low art, elite and mass culture, with an insouciance
and skill seldom matched by their successors. They, too, saw culture as mongrel and hybrid; in fact,
urban American modernism, unlike European modernism and American postmodernism, was itself visibly
multi-ethnic and biracial, an "Arrangement in Black and White," as Dorothy Parker put it, the result of the
collaboration and conflict between African American, Euro-American, and immigrant American talents
(The Portable Dorothy Parker, ed. Brendan Gill, rev. ed. [New York: Viking Press, 1973], 19-23). If
postcolonialism involves the recognition of the racial, ethnic, and gendered other, the earlier generation,
particularly its African American members, was already in some sense postcolonial as well as
postmodern, as their late-twentieth-century postmodern heirs are not. This is another of the disjunctions
between the earlier and later periods that trouble me. For the 1920s, see my Terrible Honesty.
35. On the U.S. emergence as a superpower with World War I, see Paul Kennedy, The Rise and Fall of
the Great Powers: Economic Change and Military Conflict from 1500 to 2000 (New York: Random House,
1987), 194-333. For American commerce and mass culture exports in the 1920s, see Frank Costigliola,
Awkward Dominion: American Political, Economic, and Cultural Relations with Europe, 1919-1933
(Ithaca, N.Y.: Cornell University Press, 1984); and Emily S. Rosenberg, Spreading the American Dream:
American Economic and Cultural Expansion, 1890-1945 (New York: McGraw-Hill, 1980).
36. F. Scott Fitzgerald, "Echoes of the Jazz Age," in The Crack-Up (1945; New York: New Directions,
1956), 14.
37. See also V. G. Kiernan, America: The New Imperialism: From White Settlement to World Hegemony
(London: Zed Press, 1978), 162-226; and Charles S. Maier, "The Two Postwar Eras and the Conditions
for Stability in Twentieth Century Europe," American Historical Review 86 (April 1981): 327-52.
38. For Allen Dulles tapping the Russians in 1921, see GS, 78-79. For the 1953 operation, see BM, 128.
For the peak cold war budget of U.S. intelligence, see "The Dossier on Anthony Lake," New York Times,
17 January 1997, A30.
39. See Reinhold Wagnleitner, "The Irony of American Culture Abroad: Austria and the Cold War," in
Recasting America, ed. Lary May, 285-301. I do not mean to suggest that American popular art, despite
government backing and manipulation, did not carry its own subversive and/or liberating messages at
home and abroad, for it undoubtedly did. See Rob Nixon, Homeland, Harlem, Hollywood: South African
Culture and the World Beyond (New York: Routledge, 1994); and Miriam Hansen, "The Mass Production
of the Senses: Classical Hollywood Cinema, Reflexive Modernization, and Popular Modernism" (paper
presented at "Modern Culture and Modernity Today" conference, Brown University, Providence, R.I., 14-15 March 1997). For discussion of the duality of purpose and motive in American mass culture, see Jean
Baudrillard, America, trans. Chris Turner (New York: Verso, 1988), 88-91; Baudrillard says that "we shall
never resolve the enigma of the relation between the negative foundations of greatness and that
greatness itself. America is powerful and original; America is violent and abominable" (88). O'Dell
analyzes a similar dichotomy between American rhetoric and practice on all levels in "Foundations of
Racism"; as a Marxist, he sees the former (free labor) as a kind of showcasing, a decoy for the latter
(slave labor), yet he himself was a die-hard Frank Sinatra fan; see Taylor Branch, Parting the Waters:
America in the King Years (New York: Simon and Schuster, 1988), 573-75. American popular art routinely
slips the federal/national noose, yet the fact that that noose tightened considerably after World War II
could not be without consequence for the artists involved, as the blacklisting and heightened FBI
surveillance of artists and entertainers in the 1940s and 1950s testify; see Larry Ceplair and Steven
Englund, The Inquisition in Hollywood: Politics in the Film Community, 1930-1960 (1979; Berkeley:
University of California Press, 1983); J. Fred MacDonald, Television and the Red Menace: The Video
Road to Vietnam (New York: Praeger, 1985); Natalie Robins, Alien Ink: The FBI's War on Freedom of
Expression (New Brunswick, N.J.: Rutgers University Press, 1993); and Herbert Mitgang, Dangerous
Dossiers: Exposing the Secret War Against America's Greatest Authors (New York: Donald I. Fine, 1988).
Jules Dassin's and Richard Wright's careers, for example, were incalculably influenced by government
surveillance and interference; the denial of cabaret cards to Charlie Parker and Billie Holiday (because of
tightened drug laws) is another key instance. See Michel Fabre, The Unfinished Quest of Richard Wright,
trans. Isabel Barzun (New York: William Morrow, 1975).
40. For influential definitions and discussions of the psychopath by doctors and psychiatrists of the cold
war era, see Hervey M. Cleckley, The Mask of Sanity, 3d ed. (St. Louis: C. V. Mosby, 1955); idem,
"Psychopathic States," American Handbook of Psychiatry, vol. 1 (New York: Basic Books, 1959), 567-87;
Horace B. English and Ava Champney English, A Comprehensive Guide to Psychological and Psychoanalytic
Terms (New York: Longmans, 1958); G. M. Gilbert, The Psychology of Dictatorship (New York: Ronald
Press, 1950); Robert M. Lindner, Rebel Without a Cause: The Hypnoanalysis of a Criminal Psychopath
(New York: Grune and Stratton, 1944), idem, Prescription for Rebellion (New York: Rinehart, 1952), and
idem, The Fifty Minute Hour: A Collection of True Psychoanalytic Tales (New York: Rinehart, 1955).
These experts suggested that the psychopath was culturally conditioned, more apt to turn up in some
places and times than others, and agreed that the United States in the postwar decades was unusually
fertile in psychopathy, though they did not analyze the specific historical connections or causes.
Cleckley's emphasis on the psychopath as an instance of linguistic aphasia or "semantic personality
disorder," able to speak fluently and coherently but emotionally so cut off from what he says that "speech
in this disorder, however well formulated, has no meaning and is not language at all" (Cleckley, American
Handbook, 585), seems crucial to understanding the disorders of cold war language. Serious discussion
of the psychopath began with OSS attempts to analyze Hitler. Henry A. Murray, who testified for Alger
Hiss in 1949 that Whittaker Chambers was a psychopath, wrote one of the (unpublished) OSS reports on
the Nazi leader; see Allen Weinstein, Perjury: The Hiss-Chambers Case, rev. ed. (New York: Random
House, 1997), 437-40. See also Walter C. Langer, The Mind of Adolf Hitler: The Secret Wartime Report
(1943) (New York: Basic Books, 1972). Lindner, in Rebel Without a Cause, includes several studies of
Hitler in the bibliography although he never mentions him in the text. The psychopath was usually viewed
as male in psychoanalytic circles, symptomatic of a widespread masculine crisis of identity, but popular
culture told another story. See William March, The Bad Seed (New York: Rinehart, 1954), a novel that
quickly became a Broadway play and a movie, for a self-conscious portrait of a psychopathic girl. In the
novel, her father works for a steamship line in Latin America, and in the Maxwell Anderson play, he works
in the Pentagon's atomic energy program. Film noirs like Double Indemnity (1944) and Out of the Past
(1947) routinely featured female psychopaths.
In the mid-1960s, the psychopath label was dropped by psychiatrists, though it continues to find favor
with the popular press to this day; the term "borderline," with its heavier investment in apolitical
narcissism, replaced it. See Otto Kernberg, "Borderline Personality Organization," Journal of the
American Psychoanalytic Association 15 (1967): 641-85; and Roy R. Grinker, Beatrice Werble, and Robert C.
Drye, The Borderline Syndrome (New York: Basic Books, 1968). Hitler was now labeled by
psychoanalysts a borderline personality of the narcissist, paranoid type; see Norbert Bromberg and Verna
Volz Small, Hitler's Psychopathology (New York: International Universities Press, 1983). This shift, and the
decline in prestige of psychoanalysis in general beginning in the mid-1960s, seem to me a significant
marker of the transition from early postmodernism, as I define it in this essay, to full-blown
postmodernism; psychoanalysis became part of cosmetology, as much a style as an inquiry. On
psychoanalysis's decline, see Nathan G. Hale, Jr., The Rise and Crisis of Psychoanalysis in America:
Freud and the Americans, 1917-1985, vol. 2 (New York: Oxford University Press, 1995), 300-44.
41. James Burnham describes Lumumba as a "small time megalomaniac crook" and "a bold, psychotic
adventurer," and calls for his assassination ("At the Crack of Krushchev's Whip," National Review, 5
November 1960, 272); see also Mahoney, JFK, 38-39. For the diagnosis of Lester Young as a
psychopath, see Luc Delannoy, Pres: The Story of Lester Young, trans. Elena B. Odio (Fayetteville:
University of Arkansas Press, 1993), 134-48; for Charlie Parker, see Ross Russell, Bird Lives! The High
Life and Hard Times of Charlie "Yardbird" Parker (1976; London: Quartet Books, 1988), 219; for Billie
Holiday, see Donald Clarke, Wishing on the Moon: The Life and Times of Billie Holiday (New York: Viking,
1994), 304-5. For Pollock, see Steven Naifeh and Gregory White Smith, Jackson Pollock: An American
Saga (New York: Clarkson N. Potter, 1989), 745. For Kerouac (or in this case, his work), see David
Dempsey, review of Dr. Sax: Faust Part Three, by Jack Kerouac, New York Times Book Review, 3 May
1959, 28-29.
42. For cold war language, see Carol Cohn, "Sex and Death in the Rational World of Defense
Intellectuals," Signs 12 (spring 1987): 687-718. She notes that there is no word for peace, except
"strategic instability" in this language; since the point of view is that of the weapons, human casualties are
described as "collateral damage."
43. The psychoanalytic work I have found most useful in understanding this state of denial, particularly in
U.S. policy makers and CIA members, is Robert Jay Lifton, The Nazi Doctors: Medical Killing and the
Psychology of Genocide (New York: Basic Books, 1986), esp. 159-60, 445-7, 500-1. Lifton finds the Nazi
doctors he interviewed examples of pseudopsychopathy rather than themselves psychopathic in the
sense that Hitler was; in Lifton's analysis, where the psychopath splits the self off from reality, the
pseudopsychopath doubles it. The psychopath knows no law other than his own and has no conscience
in the conventional sense of the word; the pseudopsychopath does, but is able to block it in order to be
"part of a larger institutional structure which encourages, even demands it" (423). He needs to believe, as
Lifton puts it, "Anything I do on planet Auschwitz doesn't count on planet Earth" (447); he needs to explain
his behavior, by notions of Jews as the carriers of disease, or, as the CIA leaders did, by notions of
national security. I find parallels in the behavior and manner of the doctors Lifton interviewed and the
analysis of CIA leaders by family members and biographers; see Thomas's interview with CIA official
Richard Bissell (BM, 338-40). Both Lifton and Thomas remark upon the denial of guilt, and the
remoteness of the men they interview as they discuss their most reprehensible activities. Yet the suicides
of former Secretary of the Navy James Forrestal and of CIA insider Frank Wisner, the retreat of Sidney
Gottlieb into medical missionary work in leper colonies, and the deepening depression of CIA leaders
Tracy Barnes and Desmond FitzGerald imply that the denial was less successful in the Americans' case.
Before his death, Barnes became an advocate for black Civil Rights. See BM, 135, 198, 320-21, 335-6;
Isaacson and Thomas, The Wise Men, 468-70; and Townsend Hoopes and Douglas Brinkley, Driven
Patriot: The Life and Times of James Forrestal (New York: Knopf, 1992), 387-478. For a critique of the
reification of evil as a front for conservative policies by a lifelong pacifist radical, see A. J. Muste,
"Theology of Despair: An Open Letter to Reinhold Niebuhr," in The Essays of A. J. Muste, ed. Nat Hentoff
(New York: Bobbs-Merrill, 1967), 302-7. For Muste's career, see Nat Hentoff, Peace Agitator: The Story
of A. J. Muste (New York: Macmillan, 1963).
44. Harry Rositzke, quoted in BM, 35. F. D. R. explained the wartime alliance with Stalin by quoting a
Balkan proverb: "It is permitted in time of grave danger to walk with the devil until you have crossed the
bridge" (AR, 203). Allen Dulles said that "an intelligence officer should be free to talk to the devil himself if
he could gain any useful knowledge [thereby]" (GS, 233). Truman, justifying using the atom bomb on
Japan, explained, "When you deal with a beast, you have to treat him as a beast" (LaFeber, America,
Russia, and the Cold War, 26). Of course, Stalin quoted similar maxims, citing Sergei Nechaev, The
Revolutionary's Catechism, "make use of the Devil himself if the revolution requires it" (cited in
Radzinsky, Stalin, 97).
45. The Letters of William S. Burroughs 1945-59, ed. Oliver Harris (New York: Viking, 1993), 364.
46. On the doctrine of credibility and its psychological implications, see Robert J. McMahon, "Credibility
and World Power: Exploring the Psychological Dimension in Postwar American Diplomacy," Diplomatic
History 15 (fall 1991): 458-71. For useful studies of the psychology and politics of lying, see Sissela Bok,
Lying: Moral Choice in Public and Private Life (1978; New York: Vintage, 1989); and David Wise, The
Politics of Lying: Government, Deception, Secrecy, and Power (New York: Random House, 1973).
47. For the suburbs and the housing maps, see the indispensable Kenneth T. Jackson, Crabgrass
Frontier: The Suburbanization of the United States (New York: Oxford University Press, 1985); on the
screening program in World War II, see Rebecca Schwartz Greene, "The Role of the Psychiatrist in World
War II" (Ph.D. diss., Columbia University, 1977). On Hoover and the FBI, see Curt Gentry, J. Edgar
Hoover, The Man and the Secrets (New York: W. W. Norton, 1991); Athan G. Theoharis and John Stuart
Cox, The Boss: J. Edgar Hoover and the Great American Inquisition (1988; New York: Bantam, 1990);
and Richard Gid Powers, Secrecy and Power: The Life of J. Edgar Hoover (New York: Free Press, 1987).
On sexual and political repression during the cold war, see Estelle B. Freedman, "'Uncontrolled Desires':
The Response to the Sexual Psychopath," Journal of American History 74 (June 1987): 83-106; George
Lipsitz, Class and Culture in Cold War America: 'A Rainbow at Midnight' (South Hadley, Mass.: J. F.
Bergin, 1982); Geoffrey S. Smith, "National Security and Personal Isolation: Sex, Gender, and Disease in
the Cold War United States," International History Review 14 (May 1992): 307-37; Benjamin Welles,
Sumner Welles: FDR's Global Strategist (New York: St. Martin's Press, 1997) (Sumner Welles was forced
out of FDR's administration in 1943 because of his homosexual activities); and Stephen Whitfield, The
Culture of the Cold War (Baltimore: The Johns Hopkins University Press, 1991).
48. For such shifts, see Lionel Trilling, Sincerity and Authenticity (1972; New York: Harcourt, Brace, and
Jovanovich, 1980). Charles Taylor, "The Politics of Recognition," in Multiculturalism: Examining the
Politics of Recognition, ed. Amy Gutmann (Princeton, N. J.: Princeton University Press, 1994), 25-73, and
Anthony Appiah, "Identity, Authenticity, Survival: Multicultural Societies and Social Reproduction," in ibid.,
149-63, build on Trilling's analysis. On narrative dysfunction, the inability to tell our own stories, see also
Charles Baxter, "Now There's No Who There," New York Times, 29 March 1997, 19; and C. K. Williams,
"Admiration of Form: Reflections on Poetry and the Novel," American Poetry Review 24 (January 1995):
13-23. The need to satisfy the denied truth instinct in order to rebuild a life, or a nation, has been evident
in widely diverse cultures in the last decade. South Africa's Truth and Reconciliation Commission's
attempt to address the crimes of the Apartheid era, Argentina's new willingness to explore its past
complicity with German Nazis by the release of hitherto secret files and the establishment of an
international truth commission, the Japanese government's apology to the "comfort women" of World War
II, Canadian reassessment of Aboriginal claims, and the partial redress offered by the U.S. government to
Japanese-Americans interned during World War II and to the victims of the Tuskegee and radiation
experiments--all testify to a broad belief, in the words of the South African Alex Boraine, that only truth
can begin the healing process after an era of "deceit, lies and coverups" (Alex Boraine, quoted in Ellis
Cose, "Forgive and Forget?" Newsweek, 21 April 1997, 45). See also Anthony DePalma, "Canadian
Court Ruling Broadens Indian Land Claims," New York Times, 12 December 1997, A3; and Calvin Sims,
"Argentina Dispersing the Nazi Cloud," New York Times, 19 April 1997, 6. This process is, of course,
subject to abuse; see Deborah Sontag, "Too Busy Apologizing To Be Sorry," New York Times Week in
Review, 29 June 1997, 3; Seth Stevenson, "Apologies," www.slate.com, 11 July 1997, 7-8; and Patricia J.
Williams, "Apologia Qua Amnesia," Nation, 14 July 1997, 10.
49. William Plummer, The Holy Goof: A Biography of Neal Cassady (1981; New York: Paragon House,
1990), 156.
50. Hart Crane, quoted in John Unterecker, Voyager: A Life of Hart Crane (New York: Farrar, Straus, and
Giroux, 1969), 10; The Complete Poems of Hart Crane, ed. Brom Weber (Garden City, N.Y.: Anchor
Books, 1966), 40; Langston Hughes, quoted in my Terrible Honesty, 89.
51. The Portable Jack Kerouac, ed. Ann Charters (New York: Viking, 1995), 484, 322; Robert Hayden,
Collected Poems, ed. Frederick Glaysher (New York: Liveright Publishing Corporation, 1985), 54, 41.
52. Sidney Gottlieb, quoted in BM, 224. For African American suspicions related to Gottlieb's role in the
Congo, see Patricia Turner, I Heard It Through the Grapevine: Rumor in African-American Culture
(Berkeley: University of California Press, 1993), 112.
53. Andrei Voznesensky, quoted in Dotson Rader, Tennessee: Cry of the Heart (New York: Doubleday,
1985), 337.
54. Dwight D. Eisenhower, quoted in Jackson, Crabgrass Frontier, 249.
55. For other descriptions of the city as atomic target or wasteland, see BB, 20, 67-8, 78-9, 91, 143, 145,
148, 152, 170, 175-6, 189, 213, 226, 237, 239, 248, 269, 278, 281-2, 287, 306, 311-12, 319-28, 364.
56. Jean-Paul Sartre, "Manhattan: The Great American Desert," in The Empire City, ed. Alexander Klein
(New York: Rinehart, 1955), 455-57.
57. Albert Camus, American Journals, trans. Hugh Levick (New York: Marlowe and Company, 1987), 39,
32.
58. Quoted in David Reid and Jayne L. Walker, "Strange Pursuit: Cornell Woolrich and the Abandoned
City of the Forties," Shades of Noir, ed. Joan Copjec (New York: Verso, 1993), 70-71.
59. On the reasons for the appearance of the disciplines of postmodernism and postcolonialism in the
academy in the 1970s and 1980s, see Terry Eagleton, The Illusions of Postmodernism (Oxford:
Blackwell, 1996), and Arif Dirlik, "The Post-Colonial Aura." Eagleton sees the origins of postmodernism in
the hasty disillusionment on the part of the radicals of the 1960s with the possibilities of political change.
Dirlik dates postcolonial studies to the arrival of Third World intellectuals at top First World universities.
Both offer illuminating readings of the two groups' opportunistic motives and self-evasions, though neither
questions the importance of much postmodern and postcolonial criticism. For comparisons of
postmodernism and postcolonialism, both as terms and as disciplines, see Anthony Appiah, "Is the Post- in Postmodernism the Post- in Postcolonialism?" Critical Inquiry 17 (winter 1991): 336-57; Homi K.
Bhabha, "The Postcolonial and the Postmodern: The Question of Agency," in his The Location of Culture
(New York: Routledge, 1994), 171-97; Diana Brydon, "The White Inuit Speaks: Contamination as
Literary Strategy," in Past the Last Post, ed. Ian Adam and Helen Tiffin (Calgary, Alberta: University of
Calgary Press, 1990), 191-204; Simon During, "Postmodernism or Post-Colonialism Today," Textual
Practice 1 (spring 1987): 32-47; Linda Hutcheon, "Circling the Downspout of Empire: Post-Colonialism
and Postmodernism," in Past the Last Post, ed. Adam and Tiffin, 167-90; and Helen Tiffin, "PostColonialism, Postmodernism, and the Rehabilitation of Post-Colonial History," Journal of Commonwealth
Literature 23 (August 1988): 169-81. All these writers share to one degree or another my own sense that
postcolonial theory [End Page 85] has a deeper commitment to political resistance and historical analysis
than postmodernism. For an authoritative periodizing of the twentieth century, see Eric Hobsbawm, The
Age of Extremes: A History of the World, 1914-1991 (1994; New York: Vintage, 1996).
60. James Baldwin, quoted in "FR," 535. For the excitement of Marxist thought for some critics of the cold
war era, see C. Wright Mills, The Marxists (New York: Dell, 1962); idem, Listen, Yankee: The Revolution
in Cuba (New York: Ballantine, 1960); and AD, 277-93. Richard Wright devoted the later part of his career
to a serious consideration of why, at the particular juncture of history in which he found himself, the
smartest men of color were smarter than the smartest white men, and why he, as a black intellectual,
found himself ahead of the West; see Wright, White Man, Listen!, 111-42; idem, The Color Curtain: A
Report on the Bandung Conference (1956; Jackson: University Press of Mississippi, 1994); idem, Black
Power: A Record of Reactions in a Land of Pathos (1954; New York: Harper Collins, 1995); and
especially idem, The Outsider (1952; New York: Harper Collins, 1993), his fictional portrait of the black
intellectual as the leader of the avant-garde. Wright's thinking on this subject is so sensible and so
athwart western assumptions that few understood his project until it was described as "an extended affair
in intercultural hermeneutics" (Paul Gilroy, The Black Atlantic: Modernity and Double Consciousness
[1993; Cambridge, Mass.: Harvard University Press, 1995], 150). Gilroy believes that Wright offers a way
out of the polarized opposition between "Eurocentrism and black nationalism" (186).
61. Anthony Appiah, "Is the Post- in Postmodernism the Post- in Postcolonialism?" 353; Miyoshi, "A
Borderless World?" 747; Edward Said, "Representing the Colonized: Anthropology's Interlocutors,"
Critical Inquiry 15 (autumn 1989): 213. Jack Kerouac, always an uncannily acute, historically minded
observer, anticipated this analysis, dating the shift to full or late postmodernism (though he did not call it
that) to 1962 and the advent of the phrase "You're putting me on" (Jack Kerouac, Vanity of Duluoz: An
Adventurous Education, 1935-46 [1968; New York: Penguin, 1994], 13). He took the shift toward lying as
accepted common practice to be the death knell of his own artistic ethos. See my "Telepathic Shock and
Meaning-Excitement: Kerouac's Poetics of Intimacy" (forthcoming, Kerouac Quarterly, [fall 1998]); idem,
"Dharma Bum," The Nation, 12 May 1997, 57-62; and idem, "Remembering Allen Ginsberg", The Village
Voice, 15 April 1997, 36.
62. For Kerouac on "100 percent personal honesty," see his Selected Letters, 1940-1956, ed. Ann
Charters (New York: Viking, 1995), 356. As the borderline diagnosis replaced the category of psychopath;
as the all-out expressiveness of the Actors Studio Method gave way to a more comedically inflected
performance ethos; as Miles Davis's "cool" style succeeded Charlie Parker's "hot" sound; as John
Ashbery's more impersonal poetry displaced in prestige the confessions of John Berryman, Theodore
Roethke, and Anne Sexton; as Pop Art eclipsed the tortured Abstract Expressionism of Jackson Pollock;
so the shift from the early postmodern temper to the full-blown postmodern style in Beat circles was
marked by the new ascendancy of Ken Kesey's Merry Pranksters, with their computer hackers and high-
tech tactics, marked disinterest in people of color (blacks, who tended to resist the acid trip, were
considered "pathetic and square"), and "who cares?" ethos (Tom Wolfe, The Electric Kool-Aid Acid Test
[1968; New York: Bantam Books, 1981], 213, 249). Ginsberg, Burroughs, and Cassady all had ties with
Kesey's group, but Kerouac--of the Beats, the least capable of moving with the times, the least
sympathetic to the later postmodern ethos--hated them. See Wolfe, The Electric Kool-Aid Acid Test,
especially 23, 80-1, 124-5, 212-13, 319, and 326; and Gerald Nicosia, Memory Babe: A Critical Biography
of Jack Kerouac (New York: Grove Press, 1985), 653-54. bell hooks has written of the discordance
between the black historical situation and the postmodern aesthetic in "Postmodern Blackness," Colonial
Discourse and Post-Colonial Theory, ed. Patrick Williams, 421-28. The eclipse of the Civil Rights
movement by the antiwar protest and the hippies in the late 1960s is another sign of the transition from
early to later postmodernism, as is the shift from hard to soft rock, from the Who and the Rolling Stones to
Carly Simon and Cat Stevens, by the early 1970s.
63. Frantz Fanon, "Lumumba's Death: Could We Do Otherwise?" in Toward the African Revolution, trans.
Haakon Chevalier (1967; New York: Grove Press, 1988), 197. The translator Chevalier, fittingly, was a
Berkeley professor of Romance languages whose career was derailed by the security wars surrounding
the development of nuclear weapons; see Goodchild, J. Robert Oppenheimer, 98, 99, and 182-3. Like
"psychopath," "charisma" became one of the buzzwords of the cold war era; Max Weber, now used as an
antidote to Marx, came into wide favor. For Weber on charisma, see From Max Weber: Essays in
Sociology, trans. H. H. Gerth and C. Wright Mills (1946; New York: Oxford University Press, 1958), 245-359; and Max Weber, The Theory of Social and Economic Organization, trans. A. M. Henderson and
Talcott Parsons (Glencoe, Ill.: Free Press, 1947), 358-423. For contemporary assessments by cold war
intellectuals, see Arthur Schlesinger, Jr., "On Heroic Leadership," Encounter 15 (December 1960): 3-11;
and Edward Shils, "Charisma, Order, and Status," American Sociological Review 30 (April 1965): 199-213. For accounts of charismatic leadership, see Jean Lacouture, The Demigods: Charismatic
Leadership in the Third World (New York: Knopf, 1970); and Ann Ruth Willner, The Spellbinders:
Charismatic Political Leadership (New Haven: Yale University Press, 1984). Tobin Siebers argues in Cold
War Criticism that New Criticism and its descendants were efforts to contain charismatic language.
Recently, a number of citizens in Third World countries, faced with Laurent Kabila in the Congo, for
instance, and his attempted return to the rhetoric of the revolution, have expressed a disenchantment with
charismatic leaders and a preference for a more technocratic and procedural style of management; see Larry
Rohter, "Caribbean Politics, American-Style," New York Times Week in Review, 27 July 1997, 3; and
Howard W. French, "Kabila Reaches Congo's Capital," New York Times, 21 May 1997, A3.
64. For James on popular culture, see The C. L. R. James Reader, ed. Anna Grimshaw (Cambridge,
Mass.: Blackwell, 1992), 151-52, 247-54; and idem, American Civilization (Cambridge, Mass.: Blackwell,
1993), 118-65.
65. On the masculine revolt, see Barbara Ehrenreich, The Hearts of Men: American Dreams and the
Flight from Commitment (Garden City, N. Y.: Anchor Press, 1983). Arguing that male defection from
family obligations in the 1950s preceded and in part inspired feminists' protests against their domestic
role, Ehrenreich supplies a mid-twentieth-century American historical context for Eve Kosofsky
Sedgwick's theory of homosocial bonding, developed in her Between Men: English Literature and Male
Homosocial Desire (New York: Columbia University Press, 1985). In this first phase of cold war culture,
as artists like Elvis Presley borrowed feminine vulnerability, emotional expressiveness and finery for male
performance, men preempted women even as sex symbols; the feminine psyche and image was
colonized for masculine development and display; see Leslie Fiedler, "The New Mutants," Partisan
Review 4 (fall 1965): 505-15. Never before or since in American cultural life has so little support or
credence been given to the independent female intellect. Researching this period sometimes feels like
coming upon a terrain that has been sprayed for women, as woods and backyards are sprayed for
insects. The long-term consequence was the women's movement, but the immediate result, as Shirley
Jackson, Carson McCullers, and Flannery O'Connor were quick to dramatize on another plane, was
mutation. Fashion witnessed a return via Christian Dior and other designers to a grotesquely misshapen
feminine form, pinched and pronged by stiletto heels, severely cinched at the waist and handicapped with
hobbled or sheath skirts and all-points uplift bras; see Elizabeth Ewing, History of Twentieth Century
Fashion, rev. ed. (Lanham, Md.: Barnes and Noble, 1992); Caroline Rennolds Milbank, New York
Fashion: The Evolution of American Style (New York: Harry N. Abrams, 1989); and Valerie Steele, Fifty
Years of Fashion: New Look to Now (New Haven: Yale University Press, 1997). For Dior's gayness, see
Marie-France Pochna, Christian Dior: The Man Who Made the World Look New, trans. Joanna Saville
(New York: Arcade, 1996). The woman of the 1950s, armed for battle with no battle in sight, was a
walking paradigm of deterrence strategy.
Fittingly, the key sexual issue of the day was less feminine desire, a preoccupation of earlier decades,
than male homosexuality. For the first time, homosexuality itself became a recognized, if phobic, third
gender position, one depicted as waiting to penetrate and preempt its two rivals; see Paul Welch, "The
Gay World Takes to the Streets," Life, 26 June 1964, 66-82. Militant queers, headed by Jean Genet and
William Burroughs, openly declared male homosexuality a calculated political "refusal," in Genet's words,
"to continue [a] world" whose crimes had rendered it unworthy of allegiance. Genet claimed that effete
homosexual fashions and mannerisms expressed no nostalgia for the feminine, but rather a "bitter need
to mock virility"; for Genet, the homosexual could be defined as a man for whom "the entire female sex . .
. doesn't exist" (Jean Genet, quoted in Edmund White, Genet, 384, 385, 170). (Burroughs says in his
letters that "women have poison juices" [190]; "every U.S. bitch of them want . . . [a] man all to herself,
with no pernicious friends hanging about"[205]--femininity is nothing but a conspiracy against male
bonding). Arthur Miller analyzed the new fascination of 1950s theater with the sensitive, adolescent
male rebel--the androgynous, bisexual, or homosexual Montgomery Clift, Tony Perkins, Marlon Brando,
and James Dean offered cases in point. Miller says that they question "the right of society to renew itself
when it is, in fact, unworthy" (Arthur Miller, "The Shadows of the Gods," American Playwrights on Drama,
ed. Horst Frenz [New York: Hill and Wang, 1965], 149); Norman Mailer, "A Review of Jean Genet's The
Blacks," The Presidential Papers (1963; New York: Berkley Medallion, 1970), 199-211, defines
homosexuality in similar, hypersignificant, cosmically politicized terms, as the willed death of "biology" in a
declining civilization (210).
For background on homosexuality in this period, see Allan Bérubé, Coming Out Under Fire: The History
of Gay Men and Women in World War Two (New York: Free Press, 1990); Robert J. Corber,
Homosexuality in Cold War America: Resistance and the Crisis of Masculinity (Durham, N.C.: Duke
University Press, 1997); Donald Webster Cory, The Homosexual in America: A Subjective Approach
(New York: Greenberg, 1951); John D'Emilio, Sexual Politics, Sexual Communities: The Making of a
Homosexual Minority in the United States, 1940-1970 (Chicago: University of Chicago Press, 1983); Lee
Edelman, "Tearooms and Sympathy, or, The Epistemology of the Water Closet," in The Lesbian and Gay
Studies Reader, ed. Henry Abelove, Michèle Aina Barale, and David M. Halperin (New York: Routledge,
1993), 553-74; Charles Kaiser, The Gay Metropolis (Boston: Houghton Mifflin, 1997), 1-202; and Jess
Stearn, The Sixth Man (Garden City, N.Y.: Doubleday, 1961). There was, of course, an extraordinary
group of women performers in various media--Billie Holiday, Rita Hayworth, Betty Grable, Marilyn
Monroe, and Dorothy Dandridge among them--as well as a group of talented women writers headed by
O'Connor, Jackson, McCullers, Sylvia Plath, Anne Sexton, and the lesbian novelists Paula Christian and
Ann Bannon, all of whom are important to my study. It seems symptomatic of the period, however, that
my gender heroine thus far is Christine Jorgensen, who in 1952 became the first widely publicized American transsexual.
An army veteran, she liked to remind audiences that she alone knew the professional and private secrets
of both sexes. See Christine Jorgensen, A Personal Autobiography (New York: Paul S. Eriksson, 1968);
and David Harley Serlin, "Christine Jorgensen and the Cold War Closet," Radical History Review 62
(spring 1995): 137-65. For an analysis of the gender confusion endemic in cold war diplomatic language
and policy, see Frank Costigliola, "The Nuclear Family: Tropes of Gender and Pathology in the Western
Alliance," Diplomatic History 21 (spring 1997): 163-83, and idem, "'Unceasing Pressure for Penetration'":
Gender, Pathology, and Emotion in George Kennan's Formation of the Cold War," Journal of American
History 83 (March 1997): 1309-39. When the feminist movement upstaged the masculine revolt in the
media and the gay liberation movement asserted queer identity as an open, natural, and independent
gender position rather than as a covert, hostile parasite upon traditional gender norms, the early
postmodern era was definitively over.