Christian Moraru
University of North Carolina, Greensboro, USA
c_moraru@uncg.edu
Phyllis W. Hunter
University of North Carolina, Greensboro, USA
pwhunter@uncg.edu
Cultural Myths of Global-Age America: Toward A Critical Glossary
Abstract: Part of a bigger enterprise, this is an inquiry into the media and media-reinforced cultural mythology of the United States in the post-Cold War globalization era. The format is the “critical glossary” or “lexicon” made famous by a number of authors, most notably—in recent critical theory—by Roland Barthes. Basically, the glossary below represents one possible attempt to imagine what Barthes’s Mythologies might look like if one rewrote it with the 21st-century U.S. as its focus. The methodology of counter-mythography developed below represents, in this context, both an extension and a critical retooling of Barthes’s approach.
Keywords: Cultural Myth; U.S. Cultural Matrix; Cultural Imaginary; New Materialism; Media; Popular Culture; Counter-Mythography; Mythical “Streamlining” of Cultural Practices; Myth as Metalanguage; Myth, Ideology, and Alterity.
Here I am, before the sea; it is true that it bears no message. But on the
beach, what material for semiology! Flags, slogans, signals,
sign-boards, clothes, suntan even, which are so many messages to me.
Roland Barthes, Mythologies
1. Introduction: Myth, Ideology, and Counter-Mythography[1]
Cultural Myth. What follows is part of an ongoing project that provides a cross-section of the cultural mythology of the United States at the dawn of the 21st century. This undertaking is a critical glossary of sorts. Deliberately reminiscent of Roland Barthes’s landmark 1957 Mythologies and Raymond Williams’s 1976 Keywords—but perhaps also of Gustave Flaubert’s posthumous Dictionnaire des idées reçues—the Glossary catalogues and analyzes a representative cluster of cultural myths that, we argue, structure the material imaginary fashioning the public and private lives of Americans at the turn of the new millennium.
Simply put, what the Glossary sets out to do is identify and scrutinize a set of myths: objects from clothes to food, hi-tech gizmos, and other similar commodities; images, symbols, obsessions, and icons; celebrities, discourses, and products of the mass media, popular culture, and visual-digital worlds generally; historical as well as virtual figures, events, and places; and everyday-life cultural practices and formations from commercials to games. Together, these constitute, we maintain, the mythical matrix of today’s U.S. culture. As such, we also contend, they hold high identitarian value, for it is in relation to them that America and Americanness have been experienced and imagined during the past few decades.
This lexicon is critical in that it is grounded consistently, if flexibly, in a specific methodology, as described below; it reflects, as one would expect, the authors’ expertise, interests, and choices; and it supplies, therefore, one possible way of mapping out American cultural mythology. Barthes’s treatment of Balzac’s 1830 “scène de la vie parisienne” Sarrasine in S/Z comes to mind here: not unlike the textual body of the Balzacian novella, the continuum of U.S. cultural mythology can be broken up into mythical units in any number of fashions, depending on the critical grid used. We tend to dwell on the material embodiments of American mythology, on the humbly and perhaps surprisingly mythical in its quotidian-utilitarian incorporations, although, occasionally, classical American myth as studied by the R. W. B. Lewis-Leo Marx-Alan Trachtenberg school of Americanists and the kind of myth that usually draws us do cross paths. The Glossary develops, in consequence, a particular and deliberately idiosyncratic list of myths and interpretations of individual myths. Needless to say, other critics may have their own lists or their own takes on the myths on our list. Where we see the imprint of myth, they may not see much; or, they may see something where we do not; or, they may discern a different myth in the same object or phenomenon, and so forth.
As suggested earlier, the mythology in question underpins the identity-shaping material routines and fantasies of our time—the late-global U.S. Late globalization is, then, the cultural-historical context within which we carry out our survey and on which, as explained below, we bring to bear a critical method, a cultural mythanalyse (Gilbert Durand) or, more exactly, a counter-mythography of sorts, attuned to today’s America and its cultural-material landscape. To invoke Durand once more, Americans are leading their lives within a “mythical décor” lodged at the crossroads of the present and the past, here and elsewhere. This “mythical ambiance” features a contemporary, unfolding yet identifiable configuration both moored in tradition (evolved or invented) and embedded in the complex shifts brought about by the post-World War II era, especially by the last several decades, a time span some historians refer to as late globalization. While this period’s roots, much like the mythology defining it, reach deeper into American and world history, the cultural myths we zero in on speak to the present moment and to the impact its geocultural, political, and technological developments have been making on how Americans live and view themselves and their world.
In the Glossary, myth is not the classical, generic notion that the metaphysical-religious and archetypal-anthropological tradition of James Frazer, Carl Gustav Jung, Mircea Eliade, Joseph Campbell, Gaston Bachelard, or even Claude Lévi-Strauss belabored, although we do use their suggestions occasionally. Here, myth is, by and large, cultural myth in the sociology and material-semiotics line of more recent critics who have seized on myth in its material articulations into concrete everyday objects, as “things” and as the practices and sites associated with them. These critics include: Barthes and other French mythologues drawing on the Mythologies paradigm such as Jérôme Garcin and his contributors (Nouvelles Mythologies), Michel Maffesoli (Iconologies), and Julia de Funès, who is also a latter-day Flaubertian keen on deflating our philosophical clichés (Coup de philo . . .); Jean Baudrillard (we refer chiefly to his work on consumption, objects, simulacra, and American culture); Pierre Bourdieu (we found his work on taste and the field of cultural production still useful); Umberto Eco (and his funny Travels in Hyperreality); and John Frow, Bill Brown, Daniel Miller, Barbara Johnson, Yuriko Saito, Bruno Latour, and other “thing theorists,” “new materialists,” “everyday aestheticians,” and “actor-network theorists.”
In the “culturalist” line of thought, myth is an image, object, cultural site, expression, or discourse that, following a process of mythical investment, becomes a vehicle for an idea, notion, representation, understanding, penchant, or disposition (à la Bourdieu), in short, a concrete embodiment of a symbolic value that encodes and structures the material-imaginary life of a community. In other words, that community conducts its daily rites, accounts for them, and thus for itself, in conjunction with, if not under the spell of, a set of symbolic objects, cultural protocols, or discourse constructs that come into being as they re-signify preexistent objects, protocols, and discourse constructs. Sometimes deliberate and more often than not unconscious, this re-signifying operation has two major components: one is semiotic (but is also described, by Barthes himself, in rhetorical terms, for it is a matter of language, or rather metalanguage); the other is ideological. The two work together like the two blades of a pair of scissors. This means that
i. the mythical signifier (e.g., the luxury-class Lexus sedan as symbolic value) retools the meaning of the original object (the actual vehicle as “transportation,” as use value) in order to produce a signification (the Japanese car as a token of discriminating, individualizing choice, “uniqueness,” and socio-intellectual status);
ii. this signification is ideological in its consequences; and
iii. as we shall see below, this also implies that myth bears traces of its making, which both block and facilitate what we call counter-mythography.
But how does ideology work here? To answer, let us first reiterate that, privately and publicly, American identity has been constructed in relation to a series of characteristic cultural myths. Let us note, second, that the key function of these myths has often been to present as commonsensical and self-explanatory certain parameters, ingredients, and formal paradigms of such construction protocols. Third, if this is the case, then uncovering the workings of these myths is a matter of some urgency. For these myths’ action is not only constructive—we will allow as much momentarily—but also prescriptive and even deceptively restrictive. They do have a regularizing effect on a sociocultural environment in which individuation phenomena are pressured to fit extant patterns of culture and thus restage selfsameness. In that, myths put human diversity and cultural production at risk as they tend to “streamline” culture and constrain dreams, fantasies, expressions, and customs by rendering alternatives less “natural” and, in the particular American context, “aberrant” and “deviant.” The implication, then, is that that which is “innately human” (and subsequently “typically American”) is also the self-evident standard, an assumed, collective benchmark.
It is in this sense that the cultural mythology of the global-age U.S. lends itself to interpretation as a repertoire of commodified and regularized alterity. As Barthes also writes, myths of this sort bespeak an inability to picture “others” even as they make speaking, in both public and private spheres, and with it representation, discourse, and exchange possible. It follows that, insofar as we speak within a cultural-mythical framework, our speech “un-speaks”—ignores, shortchanges, casts aside—fundamental aspects of life and its history. Overall, this is the drama of any language, as we expound later on in detail; for now, what bears recognizing is that an interdisciplinary project such as Barthes’s has gone a long way toward shedding light on that which in the surrounding mythology threatens to rein in the sociocultural imagination and thus foreclose alternatives and change inside and across communities (see Jean-Luc Nancy, The Inoperative Community). Such threats, we think, are today more marked than ever before, given the homogenizing thrust of mass-media-disseminated, global processes in the U.S. and abroad. In fact, one of the major critics of rising American mythology, Baudrillard, repeatedly pointed to the “culture of in-difference” to which leveling myths, symbols, and imagologies are giving rise under globalization, in the world at large no less than on American soil. Our reading of myth reinserts this mythology into the system of differences and possibilities that mythical mechanisms have flattened out and brought into line and around which many of our reflexes, images, and self-images have been constructed.
It should be obvious by now that no foray into cultural mythology can afford to ignore Barthes. We draw from his work too, but rather loosely, revisiting his core concept and method for a time and place so saliently distinct from the France of the late-modernist 1950s. Thus, we make little use of his semiotic-rhetorical apparatus; truth be told, Mythologies’ reasonably informed reader can manage without the theoretical nitty-gritty of the first part of the book’s closing essay, “Myth Today.” What we retain is primarily the notion of myth as “stolen language,” a metalinguistic artifice that appropriates prior language to hide something, to designate something originally absent from or only partially present in that word, image, narrative, or material object, and otherwise to make us react in a certain way. Indeed, myth often “robs” language of its initial meaning to make room for the mythic message. The problem of myth, then, is largely the problem of the host and the parasite. Quintessentially parasitic yet hardly arbitrary—there is surely a “logic” in play here—mythical connotations take over denotations and feed on them to set themselves forth with variously disguised rhetorical urgency: indeed, mythical signification tells Americans quite peremptorily, if not in so many words, how it wants to be read, that is, how it wants them, as participants in the culture, to act and see themselves.
Methodology. The method we bring to bear on early 21st-century U.S. cultural mythology rubs against the grain of this reading in that we strive to undo what mythical metalanguage has done to the objects whose meanings it has reworked. Cultural myth, it has been remarked, “naturalizes” a certain message, advertises it as self-evident, factual, innocent (apolitical) by camouflaging or, more precisely, by de-contextualizing its signification and production. On this account, one major component of our method could be loosely described as a culturally deconstructive psychoanalysis of sorts, seeking to reinscribe the myth into the complex of meanings, protocols, and effects that mythical rhetoric suppresses, dampens, or disguises. To understand the role of the other methodological component, let us observe that, to set themselves up as natural and thus paper over their own—usually problematic—constructions, political affiliations, and ideological ramifications into more or less disguised hierarchies of race, class, gender, sex, ethnicity, or faith, myths routinely place their politics and histories under erasure, de-historicizing and de-politicizing themselves rhetorically. So what is also called for is a reading capable of reinserting myth into the conflictual, equally problematic histories to which it belongs. Active in the present and bearing on this present’s a-historical self-perception, cultural myths come from a past that must be brought to the surface: this is the “genealogical” or “archaeological” facet of our approach, which retrieves incidents, places, emotions, and implications that open up myth to uncover the meanings and occurrences it has reduced to a politically-ideologically convenient scheme.
In a nutshell, these are the tools we use to put back together the myth’s clarifying context. This reconstruction is then
i. first and foremost, psycho-cultural and political, for it shows how the myth ties into, and molds, a certain social-political and imaginary space, how it shapes actions, desires, and mentalities;
ii. historical, for it traces present myths to past negotiations, struggles, and exchanges as needed;
iii. geocultural or, more specifically, transnational, and global in some cases, for at times it has no choice but to pursue those interactions beyond the U.S.
In brief, we rewrite the myth critically, diachronically and synchronically, working its complex history, meaning, and repercussions back into the myth’s rhetorically strategic spectacle of simplicity, self-evidence, and isolation. We write back in what the mythification process has written out of the mythical object, phenomenon, or catchphrase. In exposing the myth’s synecdochic maneuverings, we explain how “impoverished myth” (Barthes) means and affects more than it may seem at first blush.
Our counter-reading also moves beyond the Frankfurt School-inspired, cultural studies tradition of critique. If, as Barthes’s commentators have emphasized, today’s America is not Barthes’s France, popular culture—postmodern or even “post-postmodern,” as some argue—is not modernity’s mass culture either. It is in this context that recent research (e.g., Coupe) makes the valid point that myth and ideology, even when the latter is understood in the evolving Althusser-Jameson-Žižek line, are not necessarily synonymous. To be sure, cultural myth “distorts” and “manipulates.” It passes itself off as something that it is not, not “inherently,” or not to the extent that it claims. In that, Barthes’s model is still useful. However, due to postmodernity’s oversaturated intertextuality and ever-expanding “agent” networks (Bruno Latour), what this myth is and what people in turn are or become as they “do things with myths” (and as myths do things with and to people) have changed. Myth can still “mythify” (read: mystify, deceive), leading to “false consciousness” under the mechanically adopted strictures of cultural consumption and ideologized reflexes. It interpellates us after a fashion, intimates that it wants us to respond to it in a certain way, to “play along.” It winks at us, toys with us, leads us on chiefly to “set us up,” but in the process it ends up giving away more than it intended or, in some cases, perhaps no less than it did intend to let on. This is myth’s “double-voicedness,” as Mikhail M. Bakhtin would have termed it, or what we call myth’s double bind. We employ the concept to stress that most myths both conceal (dissimulate) and reveal, even “instruct,” convey to the trained eye key truths about ourselves as a culture.
It is up to the counter-mythographer to figure out what, in a certain myth, must be taken at face value and read with the grain, and what is to be subjected to a counter-reading working against the myth’s grain and its rhetorically dissimulating apparatus.
We can say more confidently, following postmodernism and recent cultural analyses thereof (Stuart Hall, John Fiske, John Frow, Bill Brown), that there is manipulation by myth and manipulation of myth; there are, among mythical consumption forms, types of mythical performance ranging from reproductive consumption reflective of myth’s ideological thrust to more productive, inventive, and even critical forms of mythical discourse. As one sees the world and oneself, one may be unaware of one’s mythical investment. Alternatively, one may engage with myth—immerse oneself in the American quotidian—in ways attesting not only to the myth’s resilient power but also to one’s own willingness and ability to engage with America’s mythical matrix. When effective—discerning, less ideological—this engagement does more than just buy into myth’s (pseudo)rationality. Alert to “mythical unreason,” counter-mythography opts for a critical relationality instead, placing myth in the context, past and current, material and ideological, out of which it struggled to extricate itself.
In the same vein, the other connection worth pointing out is between host and parasitic language. In their early works, Barthes and Baudrillard hinted at a fairly trenchant distinction between the two. In turn, the differentiation implied the existence of an authentic, natural, pre-petit-bourgeois and pre-modern, pre- or non-mythical language and of a reality encoded by that language. In the wake of later language critiques, in which the two critics themselves were instrumental in the late 1970s, this dichotomy strikes us as tenuous. No more airtight is the case Barthes made for the immunity of the avant-garde and of poetry to mythical cooptation. We conclude, then—and against the backdrop of late postmodernism’s continuous, ubiquitous, and complex recyclings of discourses—that there is no pre-mythical language/reality, and that there is no a-, non-, or post-mythical language/reality either. As our critical mythographies show, the host or fulcrum of today’s myth is language and other symbolic activities that have themselves played mythical roles already. Thus, the question is no longer how to step outside (before or past) myth; the question is how to handle the mythical texture into which our lives are woven. We do not differentiate primarily between an authentic and an inauthentic reality and, analogously, between a first-order and a second-order semiotics, but between authentic (critical) and less authentic (less critical) forms of performing myth in a world where few idioms and spaces (if any) can be said to lie safely outside mythical performativity. There is no place for Thomas Pynchon—or anyone, for that matter—to hide from the sweep of mythical re-signifying; the place to be is, in fact, the mythical arena itself, such as the episode of The Simpsons in which, paper bag over his head, the famous writer can attempt to turn the myth-making operation on its head. This example also goes to show that some of us cope with myth and its ideological pressure better than others.
Conversely, some myths are better marked as such; they are “hard”—more “obvious” or more obviously encapsulating symbolic meanings (Superman, Dirty Harry)—while others are “soft,” less individualized or less conspicuously mythical and thus calling for more patient unpacking; there are the usual suspects perhaps (Die Hard’s John McClane as a latter-day American Adam), and then there are the unsuspected and the unexpected (this popular dish, that toy, or the furniture store chain offering the whole world “to go”), for whose critical “outing” the counter-mythologist may try and make a case.
2. A Sheaf of American Myths
Dirty Harry. The Harry Callahan character played by Clint Eastwood in Don Siegel’s 1971 Dirty Harry and its four sequels—Magnum Force (1973), The Enforcer (1976), Sudden Impact (1983, directed by the actor himself), and The Dead Pool (1988)—and variously recycled across Eastwood’s half-century career, from the hugely popular spaghetti westerns of the 1960s such as For a Few Dollars More (Per qualche dollaro in più, 1965) to the 2008 blockbuster Gran Torino, flaunts America’s longstanding anti-rhetorical bias. Dirty Harry is an American hero because he is the consummate antihero, which he is insofar as he spurns rhetoric as verbosity, waste of time, and weakness. He is an action hero twice over, narratively and ideologically. Protagonist of an “action-packed” police thriller series, he also bodies forth an individualist, “pragmatic” philosophy of action and “effectiveness” dismissive of empty talk and bureaucratic formalism. Notoriously, Dirty Harry “gets results” by ignoring, bending, or breaking police rules and regulations, and more broadly the laws and stipulations that complicate a “man’s” job, even “dirty” it by rendering it unnecessarily ambiguous, a matter of endless procedural nitpicking, courtroom arguments, and legalistic loopholes—in brief, by turning a case and its hard facts (a bank is being robbed; a woman is chased by a naked man brandishing a knife; a serial killer is on the loose, etc.) into annoyingly sticky rhetorical issues.
A straight shooter in more ways than one, he is brevitas in action. Neither as theatrical as John Wayne back in his day nor as outgunned as Bruce Willis would be in the Die Hard movies, Dirty Harry is not only less loquacious than either; he makes, in fact, laconism into a prime deontological marker. Callahan does not like the “jobs” he keeps getting, yet he does believe, if not in so many words, in the social and moral import of his job overall at a time—the 1970s and the first half of the 1980s—when, bogged down in red tape and “squabbles” over suspects’ rights, U.S. law enforcement looked helpless in the face of rising urban unrest and crime. He is a professional, but professionalism implies, as his demeanor suggests, a carefully measured performance whose linguistic and gestural economy affords few if any superfluous words and acts. He is not itching to say things, and he does not rush to do them either. Both a reluctant talker and a reluctant doer, he is typically “brought in” (he arrives at the “scene” in the nick of time), and even then he acts only when pushed to the limit by villains who fail to know their “limitations”—“[A] man’s got to know his limitations,” he famously let us know in Magnum Force. Lethal, his action usually constitutes a reaction whose spectacular, excessive nature is legitimated not only by the bad guys’ over-the-top, gratuitous violence typically directed at the defenseless but also by such aggressiveness’s pompous, quasi-hysterical, and conspicuously disproportionate expression.
Now, action is here dramatized as an inconvenience and, as such, a last resort. In that, Dirty Harry is a Buddhist or perhaps a latter-day Spinozian, for it is not always sufficiently clear that there is a need to act at all. If one has to act, though, the acts should be silent. Their righteousness should be self-evident, requiring neither documentation nor deliberation, either publicly or psychologically, in the hero’s inner forum. Action dispenses with nuance, thorny qualifiers, messy politics and ethics, and their formulation. Getting the job done, dirty as it may be, entails, verbally at least, silence or at most a minimum of verbal expression, because linguistic conciseness inherently alludes to—flags without expressing, without “showing off”—uncommon reserves of physical (“actual”) strength. Elliptic yet unhurried sentences, sparse replies—Dirty Harry’s most memorable words are also reactions, rejoinders—and ultimately silence are the tough guy’s “tirades” and make for a counter-rhetorical rhetoric to the extent that, in the American popular imagination, rhetoric remains associated with a braggadocio sort of moral category, with wordiness and immoderate verbalization, which in turn are signs of weakness—“all hat and no cattle,” as the Texan saying goes (during her 2008 run for the Democratic nomination, Hillary Clinton used the expression to draw a contrast between Obama’s rhetorical skills and his lack of concrete “action”).
Vice versa, strength, if authentic, “job-oriented,” finds its expression in things done—“reluctantly,” as Dirty Harry does them—rather than in things said. Simply speaking, toughness is neither verbal nor gesticulatory. In this respect, the Eastwood type is “tougher” than both the John Wayne type in a western like The Outlaw Josey Wales (1976) and the Charles Bronson sort, whose Death Wish series, also focused on inner-city turmoil, victims’ rights, and the vigilante figure, is possibly Dirty Harry’s strongest competitor in the 1970s. By and large, though, the Eastwood character comes out ahead. Some of his memorable phrases do “sound” tough. One of the most famous, “Go ahead, make my day” (Sudden Impact), was used, no less famously, by Ronald Reagan himself, for its “plain and direct” message, during one of his own showdowns with Congress.[2] “Bring it on,” George W. Bush would say when his turn came (and the question is, of course, whether recent advocates of “ethical violence” do indeed say something fundamentally different). But, overall, Callahan need not talk tough in a culture that encourages its participants to let their deeds do the talking. Instead, he looks tough, and therefore, public wisdom has it, he must be so. The fabled .44 Magnum does its part, of course—Dirty Harry’s Scorpio cannot help describing it, quite admiringly, as a “big one”—but neither the handgun nor the rest of Callahan’s arsenal (rocket launchers, harpoons, bombs, etc.) defines him as much as his look itself, the way he looks at people. His gaze judges and crushes at the same time, simplifies and packs all the remonstrations left unconveyed linguistically. Because it speaks volumes, the look can do away with words, and it characteristically does.
In effect, it is devastating because it is unaccompanied by them, and it is only as such, in total silence, that the deadpan, expressionless (“de-rhetoricized”) look becomes, against the hero’s will, not only a harbinger of the power just about to be unleashed but in and of itself a sheer display of power and so, quite paradoxically, a rhetorical manifestation. Incidentally, Sylvester Stallone in the Rambo films—where silent strength goes stealthy—and Arnold Schwarzenegger in the unending Terminator saga present us with “buff-body,” “machinic” versions of this look. These versions, some might say, degrade the original, much as Schwarzenegger tried to imitate Callahan (the Austrian action icon has told an interviewer that he saw Dirty Harry five times). With both Stallone and Schwarzenegger, though, technologically enhanced physicality stops being an accessory and takes center stage. The buff body is less where the look simply originates, less the site of a gaze trained on you, villain or spectator, and more something you cannot but look at. Highly obtrusive, this corporeal visibility is “loud.” It does here what “talk” and grand gestures are guilty of in Callahan’s book; that is to say, it makes for an extraneous, overdramatic rhetoric that just “tries too much.”
Yet again, it would be naïve to think that Eastwood’s impassive gait down a main street in a western is not a style, that the loner’s unassuming, phlegmatic rhetoric is not one. It is, and it requires patient rehearsal, for simplicity, straightforwardness, reticence, “factualism,” and so forth are in reality not that easy to come by. They are surprisingly intricate constructs that entail sophisticated processes to work them out and, no less important, to efface those very processes. Effacing, erasing the “rhetoricalness” of hard facts, of things “themselves,” should not be underestimated here. Facts, things, acts—what we do as opposed to what we say and thus presumably “spin,” “embellishing” that which things in and of themselves “are”—are never cut-and-dried. By definition, our jobs, our acts overall, are interpretive in nature, in what we “do” no matter what we actually do. In this regard, everybody’s job is “dirty,” tainted by language and its play. Dirty Harry, c’est nous, no doubt. It is also beyond question, however, that, like the Eastwood hero, we must oftentimes cope with the undue pressure of an ideology of acting, and that such pressure risks jamming the deliberative-procedural protocols of a healthy polity.
American Idol. The point is not that Fox Network’s American Idol does not feature—some might say, “produce”—legitimate artists. Demonstrably, it has. A recent winner like David Cook, for example, has plenty of talent. David Archuleta, Carrie Underwood, and others have received Grammy nominations and have been signed by big-time labels. Their record sales have been nothing to sneeze at either. The point is rather what this talent, this artistic legitimacy, does, to what kind of work, culturally and otherwise, it is harnessed within a performative setup shaped jointly by the celebrities-to-be, the famous or reasonably prominent jury members, and the voting audiences.
In fact, the right word may not be talent but skills, technical abilities. Our “idols,” the top finalists at least, are skilled, but these skills are almost exclusively geared toward recycling the cultural system within which they are demonstrated. Along these lines, content itself is quite telling. As often as not, what the contestants perform is “covers.” That is, they sing songs, some of them hits, by noted artists, on which the rising stars scramble to put a “personal” touch. But, by and large, in this context performative legitimacy comes down to the ability to relegitimate the set of aesthetic-spectacular rules, values, and expectations on which the show rests and, beyond it, American society’s larger repertoire of standards and conventions. “America’s got talent,” of course. But, in this case as in others, the “idol”’s talent is a matter of skill, speaking as it does to a mechanical-repetitive dexterity fostering cultural sameness rather than to novelty-spawning creativity.
For, to be sure, the “idol” is a “dreamer.” American Idol is, along with so many other pop culture phenomena, a prime-time riff on the American Dream narrative, a variation on the national theme and history of identity production. In this view, the show’s biography too is significant. It all started in 2001 with Pop Idol, a U.K. reality/variety TV program created by Simon Cowell, American Idol’s most (in)famous jury member and also producer of shows such as American Inventor. Cowell has been identified by commentators as the most influential member of American Idol’s judging team, and for good reason. Variety’s 2006 U.K. “Personality of the Year,” he is essentially the test the American aspirants must pass; they must prove themselves before and against him. In a sense, they compete against him—against his cultural-aesthetic and psychological idiosyncrasies. His reactions usually vary from unceremonious and ungallant to brutally discourteous, openly inconsiderate, and outright rude. Uninterested in taking prisoners, he would not give you a second chance either. He takes no pity on your pain. Stern, to the point, he can be and usually is indelicate if not nasty, and yet seldom off the mark. His sophistication is concise and mordant. Blunt to an extreme, he gives it to contestants straight, with no apparent sympathy for their predicament. In sum, he plays the villain in a drama of personal morphing—“Watch the [contestants] grow,” urges show director Michael Orland—where the public also experiences vicariously its own possible transformation in the effigy of the wannabe idol and where, to become something else, somebody famous, recognized and remunerated for his or her talent, one must overcome Cowell’s resistance.
It is of course important that both the resistance and the recognition be British. The winner’s ultimate triumph reenacts, in effect, the American independence narrative, retells a collective story of cultural and political emancipation and in turn depends on the collective unconscious to identify this heroic story in the performer’s heroicomic ordeal. This is the two-century-old story of American originality, of the U.S. as an original culture admittedly indebted to the British but eventually come into its own on its own terms, arguably in languages and styles of its own making, specifically American. Cowell plays the role of the gatekeeper to a sufficiently original and thus self-sufficient national identity. What his comments disparage openly and obliquely is “our” ability to be, as the U. S. Army commercial says, “all we can be”; his presence and verdicts imply that “we” can be at most, and we should be no more than, British surrogates. The yardstick with which he measures the participants’ performance is by and large impersonally technical, putatively “universal” (European), and has little if anything to do with the particular traditions of American pop. In obstructing the singers’ rise to fame and fortune, he stands for that which, in our heritage, held us back as a nation. In his rejection of various singers, one hears a wholesale dismissal of American cultural history. He has no second thoughts about the validity of his criteria and judgments, but that is because he has plenty of doubts about American culture and its claims to originality. For this reason, the recognition he offers so sparingly matters much more than Paula Abdul’s saccharine benedictions, by comparison exceedingly generous. We have been supposed to worship alien (metropolitan) “idols” for centuries; now, at long last, we are producing our own. This production is overtly sanctioned by the “people,” who phone in their votes.
The Brit has vetted the process, and his recognition that “America’s got talent”—the title of the show Cowell himself has produced—is certainly important. But it simultaneously marks his “defeat” and, correspondingly, the birth of our cultural independence, American originality.
Still more important is the involvement of the “American people,” who “recognize” themselves in the idol’s performance. This recognition is no less problematic, though. If Cowell signifies, clearly more than other jury members, a foreign/elite/patronizing judgment passed on our capacity for cultural self-determination and original expression, the public marks an ideological shift, a populist turn, one might say, in the show’s narrative. Cowell’s presumptions and modus operandi are elite and restrictive. He weeds “us” out. He believes in, and “autocratically” enforces, an arrogant oligarchy of the talented. He is not “democratic” (after all, the British are a monarchy). On the other hand, “we” are. We vote. The jury does too, but our votes, their sheer number, to be more exact, is what ultimately counts. Thus, in its final stage, the show shifts away from an arbitrary aristocracy of the talented few to a no less questionable, quintessentially populist democracy of talent and with it to a new set of rules. Ironically enough, these rules are equally inimical to the otherwise much-prized and loudly advertised originality. And so, at the end of the day, American Idol is national karaoke night: less riotous, better managed, and considerably more lucrative. Originality may be its mot d’ordre, but this originality preexists and stands ready to be repackaged and thrown in Simon’s face. It is not the “sound” of the performer that matters most to what we expect and hear, but the “sound” of the re-performed artist whose song and/or style the new idol copies “to perfection,” with quasi-negligible and quickly forgotten “innovations.” The show has produced no breakaway, truly creative celebrities or styles. 
American Idol does speak to our collective anxiety of being perceived as “copies” of the British and of others in general, hence to a deeply rooted and historically motivated yearning for new things, for newness generally, for being “ourselves.” In the final analysis, however, it is no more than a rehearsed exercise in mimicry, a site where American culture accommodates itself, acknowledges, repeats, and impersonates itself with minimal deviations. In the winning idol, we idolatrize ourselves, what we have been and done, and what we will be and do, all over again.
Gated Community. Many first-time visitors to the U. S. are struck right away by the absence of “real” fences between most residential properties, whether these are individual homes or apartment complexes. Indeed, in big cities and even more so in small towns and suburbs, the demarcation lines between spaced-out, single-family houses or between larger housing units do not exist physically. Or, if they do, they are rather unremarkable, architectural afterthoughts. One only assumes their existence, and, along with the imaginary boundaries it erects, this assumption is usually enough to alert if not dissuade trespassers. Primarily in the northern and mid-western parts of the country, high walls, wooden fences, iron gates, and so forth are the exception rather than the rule. You see them less than you might expect because, more often than not, their presence, their aggressive visuality, is not deemed necessary. Homeowners feel that the partition between one house and another—and more generally between “mine” and “yours”—need not be sanctioned by the traditional signifiers of separation one runs across in European neighborhoods, for instance. More exactly, the dynamic of material separateness and proprietorial prerogatives seems to be different in North America, where historically real estate ownership has not entailed the material reinforcement typical of other countries.
Make no mistake: fenceless does not mean defenseless. Nor are property markers entirely lacking. They are in place; it is just that they do not cordon the place off. That is to say, there is marking but no literal, insulating demarcation. Division fully exists, legally and otherwise. Yet it is not blatantly “topological” but “tropological,” a hint, a trope, and a convention, either completely invisible—you see it on property maps only—or symbolically inscribed into the surrounding landscape as natural and artificial signposts: bushes, trees, flower beds, gardens, rocks, hedges, miniature, overly artisanal, ordinarily white fences, mailboxes, and gazebos, not to mention front yard art objects such as the ever-popular dwarfs and deer, etc. All these understate ownership and its authority over property. There is something “literary” about them, for they are not straightforward. They do not scream their message at those walking by; they allude to the property limits with a certain studied reticence, with a coy eloquence that deliberately falls short of overt belligerence. They gesture to an ad quem (or a quo) point. Importantly, they are not boundaries but boundary figures. They evoke the limit obliquely instead of embodying it, another reason the house dog’s own house, if the pet has one, lies in the backyard—not to mention that the pooch is restricted not by inhumane chains but by the so-popular “invisible fence.” Thus, these are neither enclosures nor gates proper but metaphors and metonymies thereof—a shrub stands for a fence, a four-foot-long fence signifies the whole fence—yet they suggest that you act as if you stood in front of actual fences or gates. The visual-liminal compact holding sway here—though not so visibly—has it that immaterial, implicit, allegorical, or diminutive boundaries remain effective as boundaries while declining to assume the standoffish rhetoric of architecturally fleshed-out exclusion.
To be sure, they are there, and it is not at all clear that you, the passerby, would be welcome within them. In fact, if the manicured lawns are any indication, you would not be. But, by and large, such unassuming physicality does not explicitly keep you off either, does not register your presence as threat or disturbance.
In a culture so thoroughly shaped by the twin myths of the self-reliant individual and the frontier, the rhetorical reluctance of liminality characteristic of most residential spaces may strike some as odd. Yet, to reiterate, the point is not that the limit does not exist or does not operate, but that the multiple interruptions and fragmentations it foists on the sociocultural and environmental continuum of the neighborhood and of the wider community are rhetorically subdued. As elsewhere in America, a “soft” rhetoric is in play here. It may not call openly for neighborly participation, interaction, sharing, or responsibility across property boundaries, but it does not go out of its way to disrupt such practices either.
Not so with “gated” or “planned” communities. There were about 20,000 such developments in the U. S. in 1997, and there are many more today, with a steadily growing number of people—no longer solely affluent whites—eager to live inside their walls.[3] Now, most gated communities are not walled, and many of them are not even gated. On the other hand, other such places have upped the ante of the soft rhetoric of symbolic enclosure, what with the electronic gates and barriers, checkpoints, security guards and their guardhouses, and, last but not least, the communal-living policies enforced by the ever-vigilant Home Owners’ Associations, which make sure all residents cut their grass regularly, do not park their cars in the street, paint their houses in the “approved” colors, and so on. Either way, gated or not, these communities are, according to developers, residents, and their testimonies, websites, and brochures, principally about safety, secondarily about privacy, property value, shared living standards, and, more broadly, about shared culture. To such goals, uniformity of home design, landscaping, and overall appearance (all of them derided in a famous 1999 X-Files episode), regulations of outdoor activities, and, first and foremost, access control and monitored perimeter serve as means.
Whether the latter reach their goals or not—the extent to which the “hard” rhetoric of literal (rather than “literary”) boundary and deterrence brings about safety, no soliciting, clean streets, and the like—remains a bone of contention among homeowners, real estate agents, urban planners, activists, and critics of “fortress America.” There are data showing that in gated developments or in older neighborhoods that retrofitted themselves with gates crime is lower than in other areas as much as there are statistics indicating that gates, roadblocks, redirected, curbed, or monitored incoming traffic, and the rest have left things unchanged. Some critics claim that “forting up” a particular area makes it a more likely target for burglaries and assaults, which, the argument goes, are invited rather than deterred by the absence of people in the sidewalkless streets.
Ironically enough, the real issue is not what the “outside world” might or might not do to the world behind the gates but, quite the opposite, that is, what the latter has been doing to the former, specifically to American cities. The gated community is not simply another community within the existing one but an enclave within or, better still, an enclosure that projects itself as external to the city inside or outside which it lies and, on this account, not responsible for the city’s broader welfare. Thus understood, “community” is counterintuitive at best, a conceptual travesty if not a demise of the communal altogether, a contradictio in adiecto. For, among other things, community presupposes affiliations and obligations, but here neither is more than an administrative and fiscal nuisance gated communities put up with unless they manage to secede completely (an ongoing Californian trend) and set themselves up as independent urban units far from the inner cities’ madding crowds—another indication that planned developments’ modern spread at home and abroad is by and large a post-1970 phenomenon bound up with the decline of historical downtowns and neighborhoods and more generally of communal life across the U. S., then with the “white flight” and the ensuing rise of suburbia. It bears noting, of course, that not all suburbs are gated (entirely or partly) and, further, that only some of them are posh enough to be economically prohibitive.
And yet, walled, gated, and otherwise closed-off residential zones, with their cookie-cutter approach to domestic architecture and their exclusive tennis courts, swimming pools, playgrounds, clubhouses, and other facilities of this sort, take to a radical extreme the late-1960s-early-1970s suburban utopia of socioeconomic and racial rehomogenization, even though, it must be said, many American suburbs themselves have meanwhile become remarkably diverse. In any event, to many observers, this extreme, this exaggeration, goes against the grain of multiculturalism. Nor, some underscore, does the self-isolationist trend make much sense in the age of global networks, contacts, and cultural-economic mobility, while, quite to the contrary, others believe that globalization itself is basically responsible for the post-1990 proliferation of planned communities throughout Eastern Europe, the former Soviet Union, China, and the Gulf countries (where many such developments, heavily guarded, house foreign workers).
Whatever their rationale and no matter what else they set out to accomplish, America’s gated communities ratchet up historical residential spaces’ soft rhetoric of symbolic separation in an attempt to step outside both time and space and the responsibility toward others that comes with being part of established geo-historical ensembles. The attempt is anachronistic no less than utopian—the X-Files episode is titled “Arcadia”—or, even better put, dystopian, for this sort of “boycott” of history with its inevitable and ultimately salutary day-to-day conviviality, negotiations, and presences, this uneasiness about other faces, voices, accents, and income tax brackets, this selfishness-cum-siege mentality are bound to have socially dire consequences. Most basically, critics contend, these arrangements are unsustainable because they depend on the cities’ public services, for which the residents of gated communities are increasingly unwilling to be taxed. The new developments in particular take away economic resources and, perhaps more importantly, interaction opportunities (“social synergies”) crucial to any city’s survival. But even if gated communities were economically viable, they still would not be able to achieve their quintessentially neo-segregationist objectives. The notion that the 21st-century outer world with its endless panoply of complications, ambiguities, problems, and shifts will somehow remain outside or that the gated world would magically regress into an outside (or before) temporality is a dangerous illusion.
Your community, if it indeed feels like one, may be named so as to convoke an illo tempore and locus mirabilis, a time and place presumably prior and external to metropolitan chaos, crime, and pollution, hence the “olde,” British-sounding, rural-culture-referencing words and the agrarian-natural toponymy of “townes,” “villages,” “farms,” “acres,” “woods,” “meadows,” “creeks,” “lakes,” “pointes,” and “ridges.” But the rustic-naturist nomenclature is no more than a developer’s rather incondite ploy. This does not mean the ploy does not work. It has, and it will again. Its mythic appeal should not be underestimated. This does not mean either that many gated communities do not offer the clean and quiet streets the pamphlets promise. However, what they cannot deliver—and the evidence abounds—is life in the sociocultural vitro, in the ethical limbo of no larger duties and obligations where the advertising literature also promises, if not in so many words, to put us up.
Area 51. Some 80 miles north of Las Vegas, Area 51 is a military base occupying a highly restricted, six-by-ten-mile rectangular tract of arid land in the middle of the Nellis Air Force Range between the salt flat of Groom Lake and the Nevada Test Site where, until 1992, the U. S. ran hundreds of nuclear experiments and where the Yucca Mountain underground radioactive waste repository was also to be located.
The polemics sparked by the Nevada nuclear storage are principally political. For the most part, they played out in the open, political arena of America’s traditional, deliberative-executive bodies and routines. A case in point was the project’s apparent finale: in early 2009, more than two decades after Congress had picked the site, the Obama administration pulled the funding and virtually “killed” the proposal, thus pleasing Nevada Democrat Harry M. Reid, who happened to be the Senate Majority Leader.[4] On the other hand, the controversies surrounding Area 51 and its military and research facilities date back to the Cold War. More significantly still, these disputes are primarily cultural and only subsequently political. Notwithstanding the political-institutional scope of the questions routinely raised in these arguments—who controls the place and on what juridical-legislative grounds, why people may not enter the zone and what exactly goes on in it, why those activities are kept secret and how they might affect us all—the debates have been carried mostly in less conventional, largely cultural arenas and discourses. Despite their sustained and incontestable involvement in the Area’s operations, U. S. government agencies have been conspicuously absent from this conversation, which has been shaped mainly in non-mainstream, “fringe,” “underground,” and subculture venues, circles, and representations from UFO and extraterrestrial life aficionados to conspiracy theorists of various kinds, one-world government believers, radical environmentalists, hi-tech hackers, and self-appointed watchdogs of all persuasions.
Quite simply, the Area is not supposed to exist, and it surely does not on any official maps. Nevertheless, “the most famously secret patch of real estate in the world” is real if, no less famously, off-limits.[5] The roads leading to it are practically paved with motion sensors.[6] Its land and airspace borders alike are strictly enforced even though it took the government decades to recognize, as it did a few years ago in response to an inquiry, that the U. S. Department of Defense does have an “operation location near Groom Dry Lake,” but the facility’s “activities and operations,” like all the other projects “conducted on the Nellis Range, both past and present, remain classified and cannot be discussed publicly.”[7] We know with a reasonable degree of certainty today that throughout the Cold War the Groom Lake base had been home to a whole series of Air Force experiments and tests, from those associated with the famous U-2 spy plane to the first stealth fighters, which, in all likelihood, many sightseers took for UFOs.
After the end of the Cold War, some of the top-secret work done here was moved elsewhere. However, the Area’s security has remained tight. Even more impenetrable has grown the overall mystery shrouding the site and, consequently, the public fascination with it. Fueling this fixation is pop culture’s lucrative obsession with the place, which has resulted in a host of TV series and blockbusters from Indiana Jones and Independence Day, in which the U. S. President has a tête-à-tête with an alien pilot, to the BlackSite Area 51 videogame, the animated movie Planet 51, and the 2009 film District 9. Set in South Africa, the latter references the Southern Nevada base and the rumors about it quite transparently, in a typical move that fleshes out fictionally the dominant tone and themes of a public discussion that has given pride of place to Roswell paraphernalia and alien encounters generally, time travel and its esoteric technology, the Illuminati and their progeny, and so forth.
Fiction, fabulation, and their brainchildren from the science-fictional to the paranormal to the paranoid to the outright wacky are the key here. Over time, their exalted rhetoric has had two intriguing and interrelated consequences. The first is something that could be called “fictionalization,” that is, the fictionalization of an otherwise real place about which the U. S. government has been so tight-lipped for reasons equally real and that, accordingly, beg legitimate questions. Online chat rooms and movies have managed, though, to fictionalize both the place and its meanings, thus providing the cover of the imaginary, the unreal, and the unverifiable to that which the whole discussion set out to expose, so much so that a recent commentator describes Area 51 as a “state of the mind,”[8] i.e., as a problem, if you will, of the observer—of the critic, the concerned taxpayer, the Nevada resident—not of the “thing” itself.
The second consequence has to do with America itself. If the Area is the inaccessible domain of the far-off and the implausible, the world outside is—“logic” tells us—plausible, reasonable. By contrast, the world inside is a black hole of sorts, a hiatus in the social regimen of rationality, commonsense, and recognized forms of inquiry, research, and deliberation. This socio-epistemological rift in how we “normally” act and what we “ordinarily” are stirs curiosity and provokes questions but only to suck them up, feeding on their irrational frenzy and returning no “reasonable” answers in exchange. Whatever it is, the Area remains both unfathomable and enigmatic. We do not know what it means, yet it presents itself as America’s inscrutable object or site par excellence; doomed to come back to us unanswered as they are, our questions and concerns should focus on it. For, as the same logic assures us, there is little if anything unreasonable, debatable, false, unjust, or restrictive about the world beyond the Groom Lake base and how that world sees itself. In this sense, Area 51 is a cultural-political decoy, a “false consciousness” ploy, as one might say with an older vocabulary. As such, the Area does something similar to what a French philosopher famously said of Disneyland: “just as prisons are there to conceal the fact that it is the social in its entirety, in its banal omnipresence, which is carceral,”[9] in playing up across pop culture and the media its own fictionality, controversial purpose, and “restricted” status, the Area implicitly downplays the often problematic, questionable character of “unrestricted” America. 
This is how, decades after the Cold War, the base still works, also in Jean Baudrillard’s words, as a “deterrence machine set up in order to rejuvenate in reverse the fiction of the real.” Opposite to “fictionalization,” to “rejuvenate” is here to “realify,” to treat “real America” just like that, as unquestionably real and really unquestionable, to “dissimulate” the social’s unreality, its complexity and, at times, outrageous arbitrariness—in short, to “conceal the fact that” what we take for normal, self-evident, and so on is rarely so.[10]
Levi’s. Levi’s jeans are the typical “anti-object.” As Barthes says, there are things, artifacts, for instance, that shed their symbolic aura over time. Deprived of their aesthetic or moral dimension, they exist solely in their “thingness.” Past a certain point, the material shell is all there is to them and not much else besides. The Levi’s purport to be just the opposite. They are the object that, its otherwise aggressive materiality notwithstanding, seems to aspire to the immaterial condition of a notion. The jeans may or may not “clothe the world,” as a Levi’s slogan boldly claims, but they have certainly tried to peddle an idea to this world by identifying with that idea and the ideology behind it with an intensity seldom matched in the history of corporate rhetoric.
A quick trip to the official Levi’s website goes to show that the Levi’s see themselves as an embodiment of the heroic values that have made America great. Like this or that beer—which, the brewer reassures you, is not “about” drinking (let alone getting drunk) but about friendship—Levi’s jeans are not about fashion (or anti-fashion for that matter), not even about clothing or “apparel”; they are about “moral values.” A sartorial or, more accurately, a textile allegory of America: this is what the pants have become since 1873, when Levi Strauss and Jacob Davis made the first pair, and this is how the product has been advertised quite transparently. This is, to be more exact, a moral allegory or ideal in which gender (male), race (white), and class (blue-collar) dovetail tightly, with the latter component rearticulating and subordinating all the others in its attempt to construct and capitalize on a certain idealized image of working-class America.
This is where the distinction between fashion (or the “sartorial”) and material (the “textile”) comes in. What the Levi’s may convey as style is far less important than what they suggest as texture or fabric. Here as, some say, in other aspects of life, America does not inhere in “form,” in a particular cut, exquisite or not (a trivial if not frivolous aspect), but in the cardinal features of the cloth: resistance, resilience, endurance, reliability, dependability, and the like. All these “typically American” ideals are not necessarily or exclusively working-class, but they have been articulated historically during the rise of the working class in the second half of the 19th century and geographically in conjunction with its westward expansion around the same time. It was also the time Strauss relocated to California and, with Davis, who would join him in San Francisco later, got a patent for the “riveted” technique.
The technique itself is significant: it was intended to reinforce the denim around the pocket corners and fly bottom, in other words, to bolster the resistance of the fabric to the strains of manual labor in mines, in construction, on farms, and other workplaces like that (something the recent TV ad campaign of the Duluth Trading Company is definitely emulating). Strauss’s genius was commercial, for he gave the workers what they needed, but also ideological: he too got from them what he needed, that is, a coherently symbolic image: blue-collar or blue-jeans America, rather, the miner, the railway builder, the farm hand and, even more appealing, the cowboy in no less symbolic settings: the California of the gold rush, the frontier and the West, the prairies and the ranch. In the bargain, he also got the work ethic of rugged individualism, the hypermasculine glamour of consummate yet self-reliant effort, the quiet battle with oneself in the exacting loneliness of the plains. In the fabled 501, America becomes self-rewarding hardship and solitude, glorification of working-class heroism that extricates the “hero” from his “class,” renders the hero a loner, and the miner a gunslinger with a different type of peacemaker. Both figures end up speaking to the same ethic, to the same no-nonsense approach to life that, in order to triumph over things, boils them down to their essential, uncontroversial elements beyond “style” and “fancy.”
More importantly, both stand alone. They do come from a class, illustrate a whole category initially, but wind up symbolizing, first, heroic resistance outside associations and community, and later on, the renegade, the rebel. For several decades beginning with the 1950s and ending with the advent of “designer jeans” like American Eagle, Seven, Rock and Republic, Laguna Beach (let alone Polo Ralph Lauren or Kenneth Cole), Levi’s and, to a much lesser extent, competing brands such as Lee and Wrangler were the paragon of cool, which in turn was a synonym of Americanness, signifying as it did all the values and standards Strauss thought he had found in the American gold digger, in the pioneer, or in the Texas rancher.
However, as pointed out above, the way the brand was advertised quickly pulled this character out of its social context and affiliations with others in its class, “desocialized” it, and packaged this essentially generic figure into an individualist fantasy. This process picked up speed in the international arena, where the Levi’s quickly became, like other American products and lifestyles, an upper-crust marker. Not only is a pair of Levi’s far more expensive overseas and clearly beyond what working-class buyers can afford in countries outside Western Europe, but, like Coke and McDonald’s food, they are part of exclusive social rituals. Ironically enough, on global markets they have become the fashion, affluence, and economic status symbol Strauss thought they should never be.
Mac ’n’ Cheese. Critics have described macaroni and cheese—possibly the foremost American “comfort food”—as “emotionally significant.”[11] They could not be more right. And yet one cannot help wondering, what is the actual signification of the emotion involved? What does it mean exactly?
To put it succinctly, the dish is a symptom—perhaps the symptom—of American culinary nostalgia. More to the point, to many Americans, the casserole allows for a happy or at least “comforting” homecoming. Here, the return is presumably the return to the same, to childhood’s unadulterated sameness and its uncomplicated, “reasonable” contentment. Mac and cheese embodies one of America’s defining regressive fantasies, the gastronomic equivalent of a trip back to the womb, whose protecting ideal the family kitchen and mom’s cooking reenact subliminally and with long-lasting, psycho-cultural effects. In a sense, we eat mac and cheese to block out the “world out there” and its history perceived as a narrative of painful and extraneous “complications,” departures, and risks imposed on us after life’s symbolic pull takes us away from our early years’ coziness and weans us from their “simple,” homey pleasures.
We do not just consume the dish. We cuddle with it, or in it, as it were. After all, is the “elbow” macaroni not aptly shaped for this purpose? And has the food been compared to a “security blanket” for nothing? Mac and cheese indeed provides such an “edible” blanket.[12] It is, in slightly altered shape, the tent we used to pitch in the parental home’s living room by the fireplace (variation on the kitchen fire theme), a taste of childhood and the childhood taste per se, in which we remain eager to immerse ourselves insofar as childhood—seized as an extension of infancy and of intrauterine life before it—predates and postpones history and with it social interaction, its pressures, and responsibilities.
Complexities, alternatives, choices, and the like may be hallmarks of adulthood but, alas, come at a price. We do pay this price as “responsible adults” but are never quite happy with the raw deal we get either as individuals or as a culture. A simplicity rite, mac and cheese symbolically rescinds the tradeoff, buys us the ticket back to the land of innocent, hassle-free plainness, to a time and place before complications and ambiguities. To be sure, it is not fast food; we do not wolf it down during the lunch break so we can rush back to the grind. It is much “slower,” in fact, a quasi-soporific dinner dish that does not prompt us to swing back into action but to take it easy, to take our time and time out of lucrative time. The casserole does not “fuel” us but “heals” the wounds incurred during the bruising daily routine. It is not the adult equivalent of baby formula either. It is a babying formula, a putatively very “simple” and very “American” recipe enjoyed equally and with equalizing effects by parents and children alike. The yellowish goo of the cheese sauce base Americans eat with their macaroni is not only the denser avatar of the milk they were raised on as children. It also brings back childhood memories, as well as “retractile,” defensive fantasies.
This continuity is crucial. It is, on one level, a continuity with collective tradition, for this is a primary national dish alongside “the other mac,” McDonald’s ubiquitous burger—this one a Mac (usually upper-case “M”) with cheese. Whether or not mac and cheese was invented by Thomas Jefferson, as some food historians argue, several American Presidents from Jefferson himself to Ronald Reagan served personal versions of it in the White House. What ensured its popularity was undoubtedly Kraft Foods’ introduction of the mac and cheese boxed dinner in the 1930s. One has good reason to believe that the unfortunate comeback, at the end of the third millennium’s first decade, of the “mac and cheese economy,” as a critic has put it, will not hurt the product either.[13] Easy to prepare, nutritious in an elementarily survivalist kind of way, plain in its structure, almost implying that down-to-earth, “artless” cooking is a virtue, the pasta dinner borders on the ascetically unadorned and remains suggestive of the basic needs and pleasures of life—of the nitty-gritty, barren, and uncomplicated as positive and pleasurable, as core American values consolidated in an exigent landscape of thrift at the ever-advancing frontier (for, to be sure, the frontier runs through our kitchens as well). Mac and cheese is “satisfying” in an austere sense, which the Kraft version took to a minimalist extreme. You can even do without milk. Simply “add water,” and, within minutes, you will be ready to enjoy the stuff around the dinner table as much as in outer space (apparently, it is a regular part of U. S. astronauts’ fare). Not as demanding as the more upper-class lasagna, mac and cheese is, on the other hand, less “sandwichy,” less “impersonal” (less ready-made) than pizza and yet as easy to share.
Just about the right amount of cooking (“personal” participation) is usually involved, while many innovations on and modifications of the fundamental recipe are also possible, whether you make it from scratch or rely on the convenient Kraft package. Either way, to most American palates the final product will be dependable, enjoyable, and conspicuously redolent of the original combination and, with it, of our infancy’s pre-lapsarian atmosphere.
It hardly matters, of course, that this world prior to our “fall” into adulthood, into its sophistications and responsibilities, is a retroactive fiction, that is, that it has never been as simple and carefree as we reconstruct it. Neither childhood nor its staple dish is simple, “unsophisticated,” without complications, hesitations, variations, and histories. We are, however, very good at simplifying things in hindsight, at forgetting their historicity and heterogeneity by excising the outside, the exogenous, the unknown, and their tensions, which are so much part and parcel both of our childhood and of the sounds, colors, and flavors forever bound up with it. Much as Jefferson, we are told, brought a pasta-making device back from Europe and used American cheese to “Americanize” an otherwise essentially foreign food—which, scholars also point out, had probably been made first in Asia, by either the Chinese or the Arabs[14]—our early age is fraught with history, conflict, and problems. Like our favorite foods, for which we usually develop a taste as kids, our early age is “authentic,” and there is nothing preventing us from calling both our first years in life and mac and cheese “authentically American.” It is just that their authenticity—surely real—and authenticity in general are not what we usually imagine them to be: simple, of one piece and one kind, local, uncomplicated in makeup and birthplace. More often than not, the authentic is inherently worldly, a matter of imports, adaptations, and combinations. The genuine is—if we only look a bit more closely—a mix and a remake. That is, the genuine is never perfect; only its copy is, as such, as a copy. And, as an elaboration on ingredients, recipes, and techniques from all over the world, it has little to do with the cozy, homey, unalloyed, and quasi-“localist” values customarily associated with a food in which, whether we know it or not, we get a taste of the world.
Paper or Plastic. You have checked all items off your list, and your cart is full now. You have gotten everything: your skim milk, your granola bars, your Egg Beaters (no more real eggs for you), your salmon steaks (you made sure the thing was wild-caught), and, yes, your tofu. You have not forgotten the veggie burgers and the floss you like either, as you did last time. You have made all the right, if admittedly tough, decisions. To top it all off, you have answered the cashier’s somewhat abrupt question (“Credit or debit?”) with the amount of confidence that seemed just about right. It is not over until it is over, though, for now comes the real test: “Paper or plastic?” asks the teenage grocery bagger.
Frankly, this is the one you dread. It is predictable—predictability itself, in fact, the sheer return of the same, underwritten, alas, by no Nietzschean amor fati. And it is lame too. Ever since plastic bags were introduced in the 1960s, the supermarket version of the Hamletian dilemma has been the most frequently asked question across America and so, in an important if not immediately apparent sense, a question about America, possibly the American question. The more you think about it, the more you see the Dane’s point. Indeed, it seems that, with no more than a “pale cast of thought” over whatever we may think of “enterprising” (in) “the name of action,” “[t]hus conscience does make cowards of us all.”
For, to be sure, the interrogation, before being one, is a response to a reality that is beyond question and, as such, calls for immediate action: more than half of the solid waste we produce, and so a considerable part of our ecological woes, comes from packaging, with checkout bagging accounting for a significant portion of it. Concerned consumers appeared to be quite swayed by reports pointing, with ample display of visual documentation, to the long-lasting damage done to the environment by plastic bags and plastic in general (far less biodegradable and harder to recycle) until they realized that, according to other reports, the more eco-friendly paper bags took more energy (hence more oil) and, first and foremost, lots of trees to make. A whole tug of war between paper and plastic packaging manufacturers followed, with environmentalists and recycling companies joining the melee only to compound the debate. Plastic bags, one side has argued, are both hard and expensive to recycle, and they must be pulled out of commingled trash manually. It has been objected, however, that paper seems less versatile and costs more; some supermarkets charge you for it, while Wal-Mart does not even bother to offer the paper option. Then there is, of course, the “cultural argument” about the “overpackaging” of products that in other countries are just not packed at all, of goods that, in the U. S. and elsewhere, are already individually packed, or of items packed in boxes, bags, and bottles (vitamins, for instance) conspicuously—and unjustifiably—larger than their content. There is also the still “experimental” solution of reusing the classic supermarket bags, whether of both paper and plastic or of paper only, as there is the fancier alternative of sturdy canvas bags. One should not forget Costco’s “sensible” approach either: customers are offered the cardboard boxes in which the merchandise was shipped to the store, boxes that are themselves reusable.
The debate is not irrelevant but superficial in the etymological sense of the word: it scratches the surface of things and, in so doing, leaves a whole host of other questions unasked. These questions come—and therefore should be posed—before the retail drama, and they regard the production, circulation, and distribution of goods and values, of the means to make things as much as to store, sell, and acquire them, the ethics and politics of manufacturing and consumption and not just their ecology. Failing to raise such issues almost makes “paper or plastic?” moot, a pseudo- or non-interrogation. The question does have an object, but its answer—whatever it is—answers nothing because, in and of itself, no matter how “enlightened” it may be, it makes no difference as long as we do not ask the other questions. The choice the question implies is thus also illusory, inconsequential—not choice itself but its deceptive myth; it is the spectacle of choice or, even better, choice as theoretical spectacle. In this sense, the most common question we ask ourselves—America at its most self-scrutinizing, if you will—implies that, deep down, we want to change neither our supermarkets nor ourselves.
General Bibliography
Althusser, Louis. “Ideology and Ideological State Apparatuses (Notes Towards an Investigation).” In Lenin and Philosophy and Other Essays. Translated from the French by Ben Brewster. Introduction by Fredric Jameson. New York: Monthly Review Press, 2001. 85-126.
Appadurai, Arjun, ed. The Social Life of Things: Commodities in Cultural Perspective. Cambridge, UK: Cambridge University Press, 1986.
Appadurai, Arjun. “Introduction: Commodities and the Politics of Value.” Appadurai, The Social Life of Things. 3-63.
Attfield, Judy. Wild Things: The Material Culture of Everyday Life. New York: Oxford University Press, 2000.
Barthes, Roland. Mythologies. Paris: Seuil, 1957.
——. Mythologies. Selected and translated from the French by Annette Lavers. New York: Noonday, 1993.
——. The Eiffel Tower and Other Mythologies. Translated by Richard Howard. Berkeley, CA: University of California Press, 1997.
Baudrillard, Jean. Selected Writings. Edited, with an Introduction, by Mark Poster. Stanford, CA: Stanford University Press, 1996.
——. America. Translated by Chris Turner. London: Verso, 1996.
——. The System of Objects. Translated by James Benedict. London: Verso, 2005.
——. The Consumer Society: Myths and Structures. Translated by Chris Turner. London, UK: Sage, 2008.
——. Fatal Strategies. Introduction by Dominic Pettman. Translated by Philippe Beitchman and W. G. J. Niesluchowski. Los Angeles: Semiotext(e), 2008.
Bell, Michael. “Myth in the Age of the World View.” In Literature, Modernism and Myth: Belief and Responsibility in the Twentieth Century. Cambridge, UK: Cambridge University Press, 1997. 9-40.
Bennett, Tony, Lawrence Grossberg, and Meaghan Morris, eds. New Keywords: A Revised Vocabulary of Culture and Society. Malden, MA: Blackwell, 2005.
Berger, Arthur Asa. What Objects Mean: An Introduction to Material Culture. Walnut Creek, CA: Left Coast Press, 2009.
Berger, John. Ways of Seeing. New York: Viking Press, 1973.
Bourdieu, Pierre. Outline of a Theory of Practice. Translated by Richard Nice. Cambridge, UK: Cambridge University Press, 2003.
Brown, Bill. A Sense of Things: The Object Matter of American Literature. Chicago, IL: University of Chicago Press, 2003.
Calhoun, Craig, and Richard Sennett, eds. Practicing Culture. London: Routledge, 2007.
Cassirer, Ernst. Language and Myth. New York: Dover, 1953.
Certeau, Michel de. The Practice of Everyday Life. Translated by Steven F. Rendall. Berkeley, CA: University of California Press, 1984.
Clifford, James. The Predicament of Culture: Twentieth-Century Ethnography, Literature, and Art. Cambridge, MA: Harvard University Press, 1988.
Cook, James W. The Arts of Deception: Playing with Fraud in the Age of Barnum. Cambridge, MA: Harvard University Press, 2001.
Debord, Guy. The Society of the Spectacle. Translated by Donald Nicholson-Smith. New York: Zone Books, 1995.
Debray, Régis. Transmitting Culture. Translated by Eric Rauth. New York: Columbia University Press, 2000.
DeKoven, Marianne. “Modern Mass to Postmodern Popular in Barthes’s Mythologies.” Raritan 18, no. 2 (Fall 1998): 81-98.
Derrida, Jacques. “Structure, Sign, and Play in the Discourse of the Human Sciences.” In Lodge, Modern Criticism and Theory, 210-224.
Doniger, Wendy. The Implied Spider: Politics and Theology in Myth. New York: Columbia University Press, 1999.
Doty, William G. Myth: A Handbook. Westport, CT: Greenwood Press, 2004.
Doueihi, Milad. Earthly Paradise: Myths and Philosophies. Translated by Jane Marie Todd. Cambridge, MA: Harvard University Press, 2009.
Durand, Gilbert. Les structures anthropologiques de l’imaginaire. Introduction à l’archétypologie générale. Paris: PUF, 1963.
——. Le Décor mythique de la Chartreuse de Parme. Paris: José Corti, 1971.
——. Introduction à la mythodologie. Mythes et sociétés. Paris: Albin Michel, 1996.
Eco, Umberto. Travels in Hyperreality: Essays. Translated from the Italian by William Weaver. San Diego: Harcourt Brace Jovanovich, 1986.
——. How to Travel with a Salmon & Other Essays. Translated from the Italian by William Weaver. New York: Harcourt Brace & Company, 1994.
Eliade, Mircea. Myth and Reality. Translated by Willard R. Trask. New York: Harper and Row, 1963.
Fiske, John. Understanding Popular Culture. London: Routledge, 1987.
Fitting, Peter. “To Read the World: Barthes’s Mythologies Thirty Years Later.” Queen’s Quarterly 95, no. 2 (Winter 1988): 857-871.
Flaubert, Gustave. Le Dictionnaire des Idées reçues. Oeuvres, II. Paris: Gallimard, 1952, 999-1028.
Frow, John. Accounting for Tastes: Australian Everyday Cultures (with Tony Bennett and Michael Emmison). Cambridge, UK: Cambridge University Press, 1999.
Frye, Northrop. Anatomy of Criticism. Princeton, NJ: Princeton University Press, 1957.
Funès, Julia de. Coup de philo . . . sur les idées reçues. Illustrations by Nadège Duruflé. Paris: Michel Lafon, 2010.
Garcin, Jérôme, ed. Nouvelles Mythologies. Paris: Seuil, 2007.
Grant, Colin. Myths We Live By. Ottawa, ON: University of Ottawa Press, 1998.
Guins, Raiford, and Fiona Candlin, eds. The Object Reader. London: Routledge, 2008.
Hall, Stuart, ed. Representation: Cultural Representations and Signifying Practices. London: Sage and Open University, 1997.
Hilfiger, Tommy, and George Lois. Iconic America: A Roller-Coaster Ride through the Eye-Popping Panorama of American Pop Culture. New York: Universe, 2007.
Jameson, Fredric. The Political Unconscious: Narrative as a Socially Symbolic Act. Ithaca, NY: Cornell University Press, 1994.
Johnson, Barbara. Persons and Things. Cambridge, MA: Harvard University Press, 2008.
Jones, Ray. USA to Z: A Celebration of American Popular Culture. Nashville, TN: Cumberland House Publishing, 2004.
Jung, Carl Gustav. The Archetypes and the Collective Unconscious. Collected Works. Vol. 9. New York: Pantheon, 1959.
Kolodny, Annette. The Land Before Her: Fantasy and Experience at the American Frontiers, 1630-1860. Chapel Hill, NC: The University of North Carolina Press, 1984.
Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford, UK: Oxford University Press, 2007.
Lears, T. Jackson. “The Concept of Cultural Hegemony: Problems and Possibilities.” American Historical Review 90, no. 3 (1985): 567-593.
Lears, T. Jackson, and Richard Wightman Fox, eds. The Power of Culture: Critical Essays in American History. Chicago, IL: University of Chicago Press, 1993.
Lewis, R.W.B. The American Adam: Innocence, Tragedy, and Tradition in the Nineteenth Century. Chicago, IL: University of Chicago Press, 1984.
Lincoln, Bruce. Theorizing Myth: Narrative, Ideology, and Scholarship. Chicago, IL: University of Chicago Press, 1999.
Lipovetsky, Gilles. Hypermodern Times. Translated by Andrew Brown. Cambridge, UK: Polity, 2005.
Lodge, David, and Nigel Wood, eds. Modern Criticism and Theory: A Reader. 3rd ed. Harlow, UK: Longman, 2008.
Maffesoli, Michel. Iconologies. Nos idol@tries postmodernes. Paris: Albin Michel, 2008.
Mansour, David. From Abba to Zoom: A Pop Culture Encyclopedia of the Late 20th Century. Kansas City, MO: Andrews McMeel Publishing, 2005.
Marderness, William. How to Read a Myth. Amherst, NY: Humanity Books, 2009.
Marx, Leo. The Machine in the Garden: Technology and the Pastoral Ideal in America. Oxford, UK: Oxford University Press, 2000.
Mitchell, W.J.T., and Mark B.N. Hansen, eds. Critical Terms for Media Studies. Chicago, IL: University of Chicago Press, 2010.
Mitroff, Ian I., and Warren Bennis. The Unreality Industry: The Deliberate Manufacturing of Falsehood and What It Is Doing to Our Lives. New York: Oxford University Press, 1993.
Morris, Evan. From Altoids to Zima: The Surprising Stories behind 125 Famous Brand Names. Los Angeles: Fireside, 2004.
Nancy, Jean-Luc. The Inoperative Community. Edited by Peter Connor. Translated by Peter Connor, Lisa Garbus, Michael Holland, and Simona Sawhney. Foreword by Christopher Fynsk. Minneapolis, MN: University of Minnesota Press, 1991.
Paterson, Mark. Consumption and Everyday Life. London: Routledge, 2005.
Plate, Liedeke. “Mythical Returns.” In Transforming Memories in Contemporary Women’s Rewriting. New York: Palgrave Macmillan, 2011.
Ross, Andrew. No Respect: Intellectuals & Popular Culture. New York: Routledge, 1989.
Saito, Yuriko. Everyday Aesthetics. New York: Oxford University Press, 2008.
Saper, Craig. Artificial Mythologies: A Guide to Cultural Invention. Minneapolis, MN: University of Minnesota Press, 1997.
Segal, Robert A. Myth: A Very Short Introduction. Oxford: Oxford University Press, 2004.
Smith, Douglas, ed. “Mythologies at 50: Barthes and Popular Culture.” Nottingham French Studies 47, no. 2 (Summer 2008): 1-85.
Smith, Henry Nash. Virgin Land: The American West as Symbol and Myth. Cambridge, MA: Harvard University Press, 1970.
Lévi-Strauss, Claude. The Savage Mind. Chicago, IL: The University of Chicago Press, 1968.
Trachtenberg, Alan. “Myth, History, and Literature in Virgin Land.” Prospects 3 (1977): 127-129.
Ungar, Steven. “From Event to Memory Site: Thoughts on Rereading Mythologies.” Nottingham French Studies 36, no. 1 (Spring 1997): 24-33.
Veeser, H. Aram, ed. The New Historicism. New York: Routledge, 1989.
Williams, Raymond. Keywords: A Vocabulary of Culture and Society. New York: Oxford University Press, 1985.
Wolfe, Cary. What Is Posthumanism? Minneapolis, MN: University of Minnesota Press, 2009.
Woodward, Ian. Understanding Material Culture. London: Sage, 2007.
Notes
[1] This introductory part is the outcome of a collaboration with my colleague Phyllis W. Hunter. The glossary samples below, which I have authored, have also benefited from Professor Hunter’s feedback (Christian Moraru).
[2] George J. Church, “Go Ahead—Make My Day.” Time, Sunday, June 24, 2001. http://www.time.com/time/magazine/article/0,9171,141371,00.html (accessed July 1, 2009).
[3] Edward J. Blakely and Mary Gail Snyder, Fortress America: Gated Communities in the United States (Washington, D. C.: Brookings Institution Press; Cambridge, MA: Lincoln Institute of Land Policy, 1997), 7. See also Haya El Nasser, “Gated Communities More Popular, and Not Just for the Rich,” USA Today, December 15, 2002. http://usatoday.printthis.clickability.com/pt/cpt?act-ion=U…day.com%2Fnews-%2Fnation%2F2002-12-15-gated-usat_x.htm&partnerID=1660 (accessed July 8, 2009).
[4] “Mountain of Trouble: Mr. Obama defunds the nuclear repository at Yucca Mountain. Now what?” editorial, The Washington Post, March 8, 2009, A18, http://www.washingtonpost.com/wp-dyn/content/article/2009/03/07/AR2009030701666.html (accessed July 27, 2009).
[5] Kevin Poulsen, “Area 51 Hackers Dig Up Trouble,” Security Focus, http://www.securityfocus.com/news/8768 (accessed July 26, 2009).
[7] “Area 51,” Department of the Air Force letter dated “Aug. 1998,” http://upload.wikimedia.org/wikipedia/commons/0/06/Usaf_on_area51.png (accessed July 26, 2009).
[8] Glenn Campbell, “Area 51: Military Facility, Social Phenomenon and State of Mind,” http://www.ufomind.com/area51/.
[9] Jean Baudrillard, “Simulacra and Simulations,” in Selected Writings, translated by Jacques Mourrain, edited and introduced by Mark Poster (Stanford, CA: Stanford University Press, 1996), 172.
[12] David W. Cowles, “Comfort Food,” http://thecowlesreport.com/fastestchef-028.html (accessed July 6, 2009).