Mark Dery’s Visions of American Dread, American Dreams
In April 2012, I reviewed cultural critic Mark Dery’s latest essay collection, I Must Not Think Bad Thoughts: Drive-By Essays on American Dread, American Dreams, for The Verge. While reading, I corresponded with Dery via email to discuss the dizzying new array of cultural memes and altered realities tackled in the book—from the growing national fervor over 2012 as mankind’s endpoint, a study of Mark Twain’s dark side, and the rigidly defined masculinity of the American male, to examining the suicide note as a literary genre and exploring the obscure pleasures hidden in medical libraries. No topic is too sacred, macabre, arcane, or even outlandish. Included below is the transcript of our interview.
One of the linchpins of your latest book is the idea that, in America, chaos has become commonplace—a point that you illuminate from a different angle in each chapter. I’m curious, though: just as national politics have veered further to the right over the last 50 years, what do you imagine “normal” American life will look like in the not-so-distant future?
I never know quite how to answer this sort of question, which seems to require a cross between Carnac-the-Magnificent mentalism and neo-Marxist Jeremiad in the Mike Davis mode.
Truth to tell, I’m uncomfortable in the visionary role. I think of myself as an archaeologist of the future present, excavating the last five minutes. (In his introduction to the book, Bruce Sterling calls me the guy “who predicted the past.” I like that.)
The SF novelist J.G. Ballard, whom I read as a postmodern philosopher, believed that the future was being annexed by the ever-encroaching present. In a world where terrorist cabals have PR departments, cubicle warriors remotely pilot Predator drones, and we digest, over breakfast, headlines about face transplants and bioengineered pigs with human hemoglobin, Ballard’s claim that the world around us is increasingly indistinguishable from science fiction, and that it is therefore the novelist’s job to invent reality, makes Surrealist sense.
The essays in I Must Not Think Bad Thoughts apply Ballard’s logic to the cultural-studies approach of reading culture as if it were a text, pushing the envelope of personal essay, comic sociology, snark-monkey polemic, and drive-by deconstruction (Barthes in Mythologies, Eco in Travels in Hyperreality, Baudrillard in America, Zizek in everything) to the point where it starts shaking its rivets loose.
Frankly, I wish I’d had the intellectual courage to really open up the throttle, shattering the mind barrier between cultural criticism and speculative fiction as Steven Shaviro does in his book of “theoretical fictions,” Doom Patrols, or as Geoff Manaugh does on his website BLDGBLOG, in “architectural fictions” that mash up postmodern philosophy, visionary urbanism, and Ballardian SF.
Technological speedup, information overload, and accelerating social change have given us a culture whipsawed by Net memes and media viruses, subcultural manias and popular delusions; in the rapidly contracting space between one media event and the next, few public intellectuals can swallow the deluge of information and spit out anything profound. Since the world around the corner is already a receding speck in McLuhan’s rear-view mirror by the time we can think of anything useful to say about it, I believe it’s the cultural critic’s job not to predict the future but to write a kind of science fiction of the here and now — a post-disciplinary, genetic-chimera kind of criticism that uses every theoretical and literary tool it can lay its hands on to make sense of a world that feels, more and more, like Tomorrow Now.
In the book’s introduction, you make a pointed note about American consciousness and the growing gap between fantasy and reality (i.e., “the distance between the dream of ourselves and the face staring back at us from the cultural mirror”). Do you really believe our collective delusions are at an all-time high, or are we just more aware of everyone’s innermost thoughts thanks to the endless streams of information at our disposal?
Neither, really. The passage in question has to do with the yawning chasm between the utopian promise of the American Experiment, on one hand, and on the other our sterling résumé as the standard-bearer of democracy—an unblemished record of institutionalized bigotry, genocidal expansionism, happy complicity with brutal regimes, imperialist adventuring, corporate buyout of elected officials, government surveillance and corporate data-mining insinuated into our private lives, the fulsome worship of the Rich and Famous coupled with a brazen contempt for the poor and the disenfranchised, and, blotching every page of our history, the anti-intellectualism, nativist know-nothingism, small-town Babbittry, and religiously sanctioned intolerance and illiteracy that threaten to turn Our Fair Republic into a Land of the Yahoos, populated by Darwin deniers, global-warming skeptics, Obama birthers, 9/11 truthers, right-to-life terrorists, dug-in survivalists, and the theocratic crusaders of the religious right.
So if by “our collective delusions” you mean the distance between star-spangled fantasies of U.S.A. Number One, land of the free, home of the brave, and everyday reality in contemporary America, I do think that distance yawns wide.
But if you’re referring to some sort of ontological/epistemological rupture between the real and the virtual, à la The Matrix — a theme I played improvisations on throughout my first two books (Escape Velocity and The Pyrotechnic Insanitarium) — well, yes, I also believe that there’s a mounting tension between the self-evident truth of our embodiment (specifically, our evolutionarily engineered neurocognitive machinery) and the equally inescapable fact that we’re spending more and more of our lives out of our bodies, so to speak, psychologically immersed in worlds on the other side of the screen (social media, electronic entertainment, work-related idea-juggling or number-crunching).
The widening of the Cartesian mind-body split has profound implications on a societal scale and in our everyday lives, where people often seem to be elsewhere, even in social situations—texting, talking on the phone, surfing the Web, playing handheld games, cellphone-photographing and foodblogging their meals while the food gets cold, Tweeting and Tumblr-ing the moments of their lives rather than experiencing them in the moment, unmediated by anything other than their senses and their sensibilities.
But the flipside of what you’re calling “our collective delusions” is the collective action enabled by the social Web, and by the Internet’s ability to vault over Chinese walls and route around stupidity. WikiLeaks, the hacktivists behind Anonymous, Dan Savage’s “It Gets Better” campaign against homophobia (and his savagely funny hack of Google searches for the keyword “Santorum”), the role of social media in the Arab Spring, the Web-enabled tsunami of citizen outrage that swept away the ill-conceived Stop Online Piracy Act (SOPA): there’s no question that the Web can be a powerful force for grassroots activism, bypassing the timorous corporate newsmedia and sclerotic mainstream politics to create real social change.
Of the collective actions you’ve mentioned — SOPA protests, Dan Savage’s “It Gets Better” campaign, the hacktivism of Anonymous, etc. — can the Web alone sustain such movements and beliefs for prolonged periods of time? Or is real-world, boots-on-the-ground intervention, as evidenced by #OWS, a necessity?
The viral outrage of the SOPA/PIPA protests, the wrenching oral histories of “It Gets Better” (Studs Terkel meets Larry Kramer?), the radical transparency of WikiLeaks, and the hive-mind hacktivism of Anonymous, especially its harrying of the Church of Scientology, are object lessons in what the Web does best: facilitate the kudzu-like spread of grassroots movements and what sixties activists liked to call direct action; knit together fellow travelers, no matter how far-flung, empowering them with a sense of collective identity and strength in numbers; goad the celebrity-obsessed, horse race-flogging mainstream media into reporting stories that really matter; and route around the charade of politics as we know it, a wholly owned subsidiary of corporate lobbyists and ideological-wingnut donors like the Koch brothers.
At the same time, the Occupy movement reminds us that, contrary to popular belief in postmodern-theory circles, the streets are not “dead capital.” Occupy is a wake-up call about the corporate enclosure of the commons, both offline, in “privately owned public spaces” like Zuccotti Park, and online, in the new privatized agoras of Facebook and Twitter and Tumblr and Flickr, where our social lives are data-mined and trend-spotted by our accommodating hosts. As well, it rubs our noses in the fact that, our digital disembodiment notwithstanding, the politics of bodies still matter. Even in a Matrix world, we can still be “kettled” and pepper-sprayed by cops; “life”-affirming zealots gun down abortion providers; the religious right presses on with its legislative crusade to make the Family Research Council’s idea of Old-Testament morality the sharia law of the land, denying women reproductive control of their bodies and contraceptive control of their sexuality.
Executive Summary: I think Web-enabled activism and boots-on-the-ground direct action are at their most effective when they’re synergistic. Social media is the nervous system, and the CNN, of political protest, from the Arab Spring to Zuccotti Park. Street protest is still a symbolic gesture with the power to cut through the media spectacle and maybe even change minds, especially if it’s captured in a cellphone snapshot and broadcast across the Web. If you’re dreaming of revolution, it’s always nice to bring your body to the party; you need something to throw on the barricades.
As much as these essays shine a light on the nation’s dark undercurrents, whether it’s gun violence, the absurdity of pop icons, or otherwise, the writing is also laced with threads of humor. It’s as if there’s a winking acknowledgment that we’re all guilty of the overindulgence Americans have become known for.
Either that, or maybe just a kick, under the table, to social satirists like H.L. Mencken, William S. Burroughs (especially the Burroughs of the mordant “Thanksgiving Prayer”), Twain (of Huckleberry Finn’s darker passages), the Hitchens who rejoiced at Jerry Falwell’s death (best. atheist. zinger. ever: “if you gave Falwell an enema, he could be buried in a matchbox.”). I do believe humor is a distinguishing characteristic of intelligence, as well as the fixed bayonet of effective polemic. I debate my friends on the academic left about this all the time. The left, in America, needs more Lenny Bruces in the same way that classical music, to quote Erik Satie, needs less sauerkraut. Emma Goldman had it just about right in her quote on revolution and dancing. One might add humor. To touch on a point dear to my wizened heart, the chloroform prose and humorectomy-at-birth dourness of the Socialist Worker left—as opposed to, say, Cockburn at his best or Hitchens before he cut cards with the neocon devil—dooms it to political impotence and cultural irrelevance. Is it any accident that Stephen Colbert and Jon Stewart are more profoundly influential in shaping public opinion than everyone on the left (except maybe Noam Chomsky) or, for that matter, most beltway pundits?
The book’s title, I Must Not Think Bad Thoughts, is derived from a song of the same name by Los Angeles punk band X. But you also suggest that to “think bad thoughts,” or plunge into society’s taboo or undesirable topics as a means of deep and necessary investigation, is a writer’s mandate.
Not too self-aggrandizing a statement!
But seriously: I do believe it’s the political business of the cultural critic to poke the sharp end of his pen into the buried truths and dirty secrets, fringe subcultures and borderline personalities that often reveal more about the American scene and our cultural psyche than the mainstream does. Of course, I write in the shadow of the American Gothic, which owes much to an unforgettable encounter, at too early an age, with Hawthorne’s “Young Goodman Brown” and Melville’s Ahab raving (long before Baudrillard, we should note) that “all visible objects are but as pasteboard masks.” Sunday school drummed home the notion that deeper truths lie behind the smoked glass of our benighted perception, through which we see darkly; Watergate sharpened that point, leaving the inescapable impression that most concealed truths are dark ones. John Wayne Gacy, L.A. punk, Didion’s White Album, Hitchcock’s Shadow of a Doubt, and David Lynch’s Blue Velvet opened the trap door to crawlspaces in the American unconscious, dark rides whose tableaux are at least as reflective of who we are as those dreamed up by Spielberg and Disney. As well, my ’60s childhood and ’70s adolescence might have had something to do with my embrace of the American Gothic as aesthetic sensibility and philosophical worldview; those times, especially the have-a-nice-daymare ’70s, were a catechism of cynicism: the murder of JFK, Martin, and Malcolm, of course, but closer to home, in the California where I grew up, the Manson killings, the Zodiac Killer, the Hillside Strangler. Then, too, I suppose the left-wing cultural critiques of writers like Gore Vidal, hit pieces by New Journalists like Hunter S. Thompson, the kill-your-idols school of rock criticism (exemplified by Lester Bangs), radical media criticism such as Chomsky and Herman’s Manufacturing Consent, the insurgent social theory of renegade intellectuals such as Guy Debord, critical theory (Barthes’s Mythologies, Baudrillard’s America), the deadpan social satire of SF novelists like Philip K. Dick and J.G. Ballard, and even critically neglected masterpieces of crap culture like John Carpenter’s They Live (recently redeemed by Jonathan Lethem, I’m happy to note) reinforced the notion that Things Are Not as They Appear, and that it’s a critic’s job to punch through the false fronts of official narratives, conventional wisdom, public images, even consensus reality itself.
Hasn’t the idea of Things Are Not as They Appear taken on a whole new meaning now? What I mean is, today, more than ever, our ongoing narratives, conventional wisdom, public images, and even reality itself are open to manipulation via digital means.
Well, the stories a culture tells itself have always been “open to manipulation,” haven’t they? In fact, cultural narratives—ideological myths, in Barthesian terms—are a form of manipulation. By “ongoing narratives,” I assume you’re talking about viral news stories and media viruses, as well as the thick coating of warring interpretations (popular readings as well as expert spin) that they pick up as they roll around the culture? Or maybe you’re referring to the subterranean river of myth running beneath the surface of everyday life—Birth of a Nation origin stories, blood-and-soil fictions of exceptionalism and destiny, revisionist accounts of a Disneyfied America before the Loss of Innocence, Wired dreams of a Great Big Beautiful Tomorrow. Either way, the dominant voices in society have been hijacking cultural narratives and repurposing them to their own ends since at least the beginning of modernity, long before the Digital Age.
I play theme-and-variations on these points in I Must Not Think Bad Thoughts, in the essay “Triumph of the Shill,” about the Nazis’ enduring influence as pioneers of branding:
Hitler, like it or not, had an intuitive grasp of the semiotics of power, evidenced not only in his appropriation of the swastika and re-branding of the ragtag National Socialist movement, but in his racist stereotyping—[the design critic Steven] Heller calls it “branding demonization”—of the German Jews and, ultimately, in the forced tattooing that marked death-camp inmates for slaughter—branding in the most horrifically literal sense. “Twenty years before Madison Avenue embarked upon ‘Motivational Research,’” Aldous Huxley observed, in 1958, “Hitler was systematically exploring and exploiting the secret fears and hopes, the cravings, anxieties and frustrations of the German masses.”
Then, too, public-relations experts such as Ivy Lee, Edward Bernays, and Walter Lippmann manufactured mass consent for elite agendas at a time when the computer was barely a gleam in John von Neumann’s eye.
That said, you’re right that the digitization of virtually everything plays hell with all the old epistemological givens, making the public record more manipulable than ever: Photoshop is to the image world what genetic engineering is to DNA. David King’s extraordinary study of the airbrushing of reality in Stalinist Russia, The Commissar Vanishes: The Falsification of Photographs and Art in Stalin’s Russia, is a useful reminder that selective amnesia didn’t begin with Photoshop. Orwell warned us that who controls the present controls the past. But a wraparound media reality built out of bits makes that present more rewritable than he ever imagined. And more ephemeral: I’m talking WikiWars, articles in Papers of Record revised without mention, and, since the Encyclopedia Britannica just announced its last print edition, the disconcerting disappearance of a Gutenbergian back-up memory, in case the power fails and we suddenly find ourselves in “The Machine Stops” by E. M. Forster. Who would’ve thought that the errors in the fossil record of our times were as important as Wikipedia’s much-trumpeted ability to instantly correct that record? The Britannica’s cultural role as a sedimentary record of received truths, self-serving hypocrisies, cultural blind spots, and the like—a kind of meta-record—turns out to be very nearly as important as its role, an age ago, as donnish arbiter, alongside the O.E.D., of the Official Record.
When you look around today, and as you continue to write about our increasingly networked culture, does the idea that reality has become “indistinguishable from science fiction” seem any less surreal?
It never did seem surreal. Ballard was, and remains, the implacable nightly news anchorman for a world that has become “a map in search of a territory,” as he put it in The Atrocity Exhibition. In that I.E.D. of a novel, he writes, “A huge volume of sensational and often toxic imagery inundates our minds, much of it fictional in content. … In the waking dream that now constitutes everyday reality, images of a blood-spattered widow, the chromium trim of a limousine windshield, the stylized glamour of a motorcade, fuse together to provide a secondary narrative with very different meanings.” That impresses me as a perfectly matter-of-fact account of the stream-of-media-consciousness world we live in, where media saturation and societal speedup are making the dream logic of Surrealism—the proverbial chance meeting of a sewing machine and an umbrella on a dissecting table—an everyday reality.
In the 20th century, the collage was a metaphor—the avant garde’s way of miming, as McLuhan would say, the information overload and whirligig acceleration of the emerging mass-media society. Now, when we spend much of our imaginative lives immersed in media narratives and online social worlds, we experience reality as a collage, a posthuman narrative written by the media, for the media. McLuhan argued that technology is a mirror in which we see those aspects of ourselves—our behaviors, our abilities—that we have offloaded onto our machines, and, like Narcissus, are enchanted by them. Yet today’s technologies talk, increasingly, amongst themselves. (Think of the superfast, artificially intelligent stock-trading programs that took Wall Street on a gut-lurching plunge in 2010, guided by a machine logic that had little to do with human needs, obliterating an estimated $1 trillion in market capitalization before the market pulled out of its death dive.) The mirror regards itself, and likes what it sees. At least one of McLuhan’s predictions came true: we are, it turns out, “sex organs of the machine world,” pollinating technological evolution.
Originally published on The Verge in April of 2012.