The Collapse of Reality
Mass entertainment, technological advance, and the lure of the internet have brought us to a place we’ve never been before.
The conventional wisdom holds that conspiracy theories have become far more deeply embedded in American political culture of late. Doubtless this is true: One need only recall widespread moral panics like the satanic child-abuse cabals of the 1980s, the assorted Clinton-focused paranoias of the 1990s, the 9/11 and birther conspiracies of the 2000s, and now QAnon in the 2020s. These, moreover, are only the highlights of many dozens of lesser cases. With each successive wave of lurid fantasy, the number of adherents seems to have grown.
Of course, many Americans have held one or another outlandish belief over the past three centuries, so the existence of discernible lava flows of irrationality is not exactly new. The denizens of Salem, Massachusetts, believed in witchcraft in the late 17th century—and proved it by hanging witches. The Anti-Masonic and Know Nothing parties of the pre-Civil War era testified to similar if less murderous inclinations. We have a literature on all of this, not least Richard Hofstadter’s seminal essay on the paranoid style in American politics.
We have even experienced, long before Donald Trump showed up, what Max Lerner once called the “magnifico” type in American politics. As David Blankenhorn put it almost nine months before the November 2016 election,
We revere magnificoes because they entertain us with their grand pretensions and larger-than-life ways. They make big deals. They gamble daringly. They are permissive. They spend freely and consume lavishly. Viewing the builder’s code of self-restraint as dour and boring, magnificoes prefer to strut and swagger, brag and charm, display and self-promote. Often enough they are criminals or at least friendly with criminals—the mobsters who invented Las Vegas in the 1930s and 1940s . . . are possibly the most influential magnificoes in U.S. history.
All true, yet we feel in our bones that something about the present plague of conspiracy mongering is different and more dangerous than earlier ones. Social media enables the massive, rapid, and unfiltered flow of pulse-raising nonsense, and presages virtual mobs becoming actual ones. The populist moment has set many politicians, especially Republican ones, back on their heels, rendering them fearful followers (if not complicit cynics), rather than the truth-telling counselors of restraint they ought to be. Above all, until recently we had a President who served as a disseminator and cheerleader for many of these theories. The combination of these factors has produced something unique in American history: an insurrectionist assault on the Capitol itself, the seat and symbol of American liberal democratic ideals, with the President of the United States egging on the mob.
If all it took to set things aright was voting out a deranged President and his cowardly and craven supporters, this would be a short essay. It would already be pretty much over. But what if the actual origins of this conspiracy medusa lie deeper in the culture? Alas, they do.
An animal’s behavior is a combination of its genetic endowment and the environmental stimuli that touch it. All animals, humans included, have elemental needs and urges, and an animal’s purposes based on them define the ways in which its genetic predispositions collide with the world to engender its experience. From a neurophysiological perspective, it is clear that from even before birth a human child’s allotment of dendrites is mowed into neural pathways with each and every sound, sight, feeling, smell, and taste. The plasticity of development slows with age but it never stops, save when life itself comes to a stop.
Leaps from the individual organism to a social whole are both dangerous and inevitable. But it at least seems safe to ask whether changes in the cultural environment that surrounds all human societies can affect how mentalities or dominant frameworks form, function, and fade over time. Since so much of the human environment is man-made by dint, for example, of technological endeavor, we witness a kind of loop of self-actualization driving human history. As Erving Goffman put it in 1974 in Frame Analysis, “Society takes up and freezes into itself the conceptions we have of it.”
We are, in short, an autogenic species; our capacity for articulate speech, symbolization, and literacy makes us so. Humans have a creative spark. We are the animal world’s playwrights, able to hitch our dreams to our articulate, wide-awake imaginations. We pour scenarios bearing our hopes out into the future, and let reality temper our aspirations. If we were not so equipped, we would still be tromping about in forests, jungles, and savannahs as we were before animal husbandry and then agriculture arose.
We really can’t help ourselves in this. We can’t prevent the flow of what amounts to culture becoming an epigenetic driver of the species, even as the fabrication of our man-made environment accelerates beyond anticipation or control. The cybernetic revolution is not like previous major spasms of technological innovation. Even the steam revolution, which changed nearly every social domain during the Industrial Revolution, only substituted machine energy for human energy. The cybernetic revolution, however, seeks to substitute machine “thought” for human thought. Machines will never develop consciousness (unlike animals, they lack inherent purposes to motivate them), but we already see the many kinds of things they can do. They have given rise over the past half-century to a phenomenon I call “technovelty.”
Technovelty is a conjunction of technology and context that enables innovation in technique to pursue old objectives in new ways. One manifestation of the conjunction is clear: Thanks to hyperconnectivity—the massive networking of people—cybernetic technology has enabled the disintermediation of many social institutions, with enormous implications. Taking just a cursory glance at American politics through this prism, we have only to note former President Trump’s rule by tweet, which bypassed the filters of professional journalism and the institutions of American government, to boot.
The Stability of Reality
This cyber-tsunami, as some have described it, is overthrowing tradition, habit, stability, and predictability. Among the myriad changes it has wrought is the evolution of disinformation, of which the newest technological form is the deepfake. The deepfake, as well as earlier cybernetic contributions to the art of disinformation, has enabled the inversion of age-old practices. It has also changed the balance of offense and defense when it comes to protecting and uncovering sensitive information.
The deepfake is a product of cybernetic innovation, specifically Generative Adversarial Networks—GANs for short. But as a species of disinformation activity, the efficacy of GAN-enabled techniques depends on a context in which perceptions of reality have already been destabilized. Here we come to the rising popularity and impact of conspiracy theories. The distinction between matter-of-fact reality and fantasy has become blurred for many Americans. When we dream, our mythopoetic faculties supply metaphors for our wide-awake efforts at inferential and analogic reasoning. We would be impoverished if not wholly lost without them. But the flow can reverse: When conscious cognitive discipline lapses, our reasoning capacities can flow into an associational metaphorical mode. It is at the nexus of these cognitive modes that spectacle lives.
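For readers curious about the machinery, a GAN sets two neural networks against each other: a generator fabricates samples while a discriminator tries to tell them from real data, each improving against the other until the fakes pass. The sketch below, a deliberately toy illustration written in Python with the PyTorch library, shows that adversarial loop in miniature; its networks, data, and training settings are invented for this example and bear no relation to any production deepfake system.

```python
# Toy illustration of a Generative Adversarial Network (GAN).
# All networks, data, and hyperparameters here are illustrative inventions;
# real deepfake systems run the same adversarial contest over images or video.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Real" data: points from a Gaussian cloud the generator must learn to imitate.
def real_samples(n):
    return torch.randn(n, 2) * 0.5 + torch.tensor([2.0, -1.0])

# The generator maps random noise to candidate fakes;
# the discriminator scores how "real" a point looks.
generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
discriminator = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(2000):
    # 1) Train the discriminator: label real points 1, generated points 0.
    real = real_samples(64)
    fake = generator(torch.randn(64, 8)).detach()  # freeze generator this step
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # 2) Train the generator: push its fakes to be scored as real.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, generated points should cluster near the real data's center.
print(generator(torch.randn(5, 8)))
```

The contest is this essay’s argument in miniature: the generator succeeds exactly when the discriminator, a stand-in for our own glancing skills, can no longer tell the fabricated from the real.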
So what exactly, then, do I mean by “spectacle”? A spectacle is an attention-arresting display that depends on the target’s sudden perception of the improbable to achieve its intended effect. To work, it must evoke what some psychologists refer to as an “astounding complex.” Here we mean not special effects joined to narrative arcs that induce perceptions we know are fabricated, but rather the “wow, you don’t see that every day!” reaction that used to make circus freak shows so captivating. It’s the difference between, say, The X-Files on the one hand and Ripley’s Believe It or Not! or the History Channel’s Ancient Aliens on the other. The in-between nature of spectacle also defines the genre known as reality TV: The people are real but the situation is contrived, leaving us unsure which parts of what we see on the screen are scripted and which are not. The evocation of uncertainty, not certainty one way or the other, is the payoff point.
Ripley’s Believe It or Not!, which dates from the pre-television era, has become impossibly quaint as a mode of excitement qua entertainment. Technovelty has overrun it. Today, astounding complexes are ubiquitous as “technical events” in screen-delivered fare in television and movies—most often in the form of rapid scene cuts that take our senses to places where our bodies can’t go. For example, in the real world we cannot watch Irma and Lester Dinkel dining at their table in Wichita, and a second later be watching their son Roybob robbing a convenience store in Bayonne. We can do so in television and the movies, and now on our broadband-connected smartphones. When we experience such cognitive astounding complexes, little squirts of endorphins still spit from our pituitary glands despite the fact that these kinds of experiences have become so common that we no longer even notice them.
We see more, and more rapid, scene cuts per minute in commercials than in regular programming. Although the extra cuts cost marginally more to make, viewers made more alert by the multiplied scene shifts are more likely to remember the product and hence to buy it, so advertisers judge the added expense cost-effective. We may think ourselves immune to consumerist sirens, but, statistically at least, we are easily influenced moist robots.
Many people in densely wired technological environments now experience more mediated images than real ones. Data on the average waking hours that Americans spend sitting in front of screens are shocking and still rising. On college campuses, very large percentages of students are neurobiologically addicted to their phones, thereby shaping their brains in ways that appear to be busily undoing the revolution in brain circuitry created by generations of their deep-literate forebears.
My suspicion is that we denizens of technovelty-saturated cognitive environments have gotten so used to mediated events that, at a preconscious level, reality has blurred with fantasy and disciplined reasoning with associational processing, particularly in younger people who have never experienced a pre-cyber infospheric reality. But older people, if they expose themselves to enough high-res fictive spectacle, can blur reality and fantasy, too. They demonstrably do so along lines that George Gerbner pioneered in his work on the “mean world syndrome”: People who watch a lot of shock-bar-seeking commercial mass-entertainment fare massively overestimate the amount of violence, adultery, pederasty, and miscellaneous crime in the real world.
Many Americans so regularly enjoy high-tech entertainment modalities that our habit of suspending critical thinking in order to immerse ourselves in entertainment venues appears to have created a shadow effect that migrates, without our conscious awareness, to other domains—very much including politics. We enjoy others screwing around with potentially all five ways of knowing truth—empirical, rational, introspective, memorial, and testimonial. The more improbable the success, the more the endorphins flow and the more we enjoy it.
Now, most assume that rock-bottom reality, what Alfred Schütz called the Lebenswelt, will always be clearly distinct from any and all fictive adumbrations of it, at least to normal adults. But we rarely ask ourselves why we assume this. We assume it because, until fairly recently, typical adults’ perceptions were dominated by direct experiences with the world beyond our heads. For the first time in human history, the cyber-tsunami has thrown that premise into doubt. The artificial world plugged directly into our heads now aligns with technology’s tendency to produce a condition of continuous partial attention to multiple stimuli. Our brains are being rewired—literally, not just metaphorically—as they must be, in accordance with the stimulative environments in which we place ourselves. It cannot be any other way.
So I do not mean spectacle as relatively rare splotches of diversion in otherwise workaday lives; I mean spectacle as a novel cognitive way of life founded in the nexus between neurophysiology and the spectral shift in the culture abetted by the cyber-innovations of recent decades. As Neil Postman, Neal Gabler, and others have argued, high-tech entertainment and the values peculiar to entertainment have conquered reality and inculcated into the culture the value array most common to it. Those values have come to include traits of the anti-hero: brashness, arrogance, vulgarity, bullying, and selfishness among them, but also distrust, suspicion, zero-sum thinking, and—most relevant to the rise of conspiracy theories—default suspicion of the motives of all social authorities. Human beings are imitative, assembling personalities from those to whom they are most frequently exposed. If “cool” people display these values on television and in the movies, mesmerized fans will adopt them to one degree or another. It’s been said before, but it’s worth repeating: Life imitates art, even very bad and crass art.
The final performance of Ringling Bros. and Barnum & Bailey Circus took place at Nassau Veterans Memorial Coliseum on May 21, 2017, but the larger truth is that the circus never left town. On the contrary: We love and will pay handsomely for a constant diet of attention-arresting displays that depend on our sudden perception of the improbable to achieve their intended effect. Why do we pay? Because we need others for spectacle to happen. We can’t tickle ourselves or perform magic tricks on ourselves. Similarly, we can’t experience spectacle solipsistically. We may acknowledge others conspiring with us to produce spectacle for the fun of it, but increasingly, the experience being so ubiquitous, we don’t bother; scene cuts and much else work seamlessly in our entertainment streams. So does spectacle applied to politics when, for many people, the entire frame of the activity has been rendered into a zero-sum form of bloodsport.
Deepfakes depend on the opposite, but related, cognitive process. We are being astounded but don’t realize it. We can’t be taken in if we perceive accurately that others are engaged in deceiving us. We don’t want to be taken in; certainly we care that hidden others seek to manipulate and instrumentalize us. But if our glancing skills have atrophied to the point that no suspicion arises, we far more readily get fooled. The question then becomes: Are we now so habituated to failing to notice that others are tampering with our cognitive normality for the sake of our fun (and their profit) that we have become desensitized to those, at home as well as abroad, intent on screwing us over (also for their profit)?
Given how thoroughly the norms of entertainment culture have spread into the “real” world, it seems clear that a deepfake has a greater chance of working on a larger percentage of a given population if the technological environment has rendered false displays seamless to our sensory experience. In other words, the very formula that enables us to be entertained is the same formula that helps others to fool us for nefarious purposes. We may have constructed the scaffolding of our own gullibility.
Fooling Others, Fooling Ourselves
The derangement of American politics in recent times is not, of course, a result of deepfakes saturating the national infosphere. We cannot blame our problems on foreigners, try as they may to exacerbate our distress. Alas, the truth is far more prosaic. Nevertheless, it is important to understand how deepfakes work as a conjunction of technology and context, because the same mentality shift in the culture sheds light on how the rise of conspiracy theories has contributed to that derangement. It turns out that fooling others and inadvertently fooling ourselves share similar cognitive foundations: Both cause the perception of reality to collapse.
Let me illustrate by calling attention to an old cartoon show, “The Smurfs.” In every episode, when the dialogue came to a point where a ten-cent word any child could learn was appropriate, the writers inserted the word “smurf”—as a noun, verb, adjective, adverb, or whatever. “The Smurfs” thus systematically turned educational opportunity into a graveyard of dumbed-down indifference.
The contemporary penchant for conspiracy theories works in a roughly similar way. Whenever people not particularly well educated, usually through no fault of their own, try but fail to account for a vexatious social reality weighing down on them, they tend to reach for whatever vocabulary uppermost in their minds seems apposite. That is often the vocabulary of fictional fare from mass-entertainment movies and television, or video-game imagery for the many who are into that. As a result, we have reached the point that increasingly large numbers of politically mobilized, but not necessarily deep-literate, Americans cannot even imagine an explanation for any actual political or social circumstance that is more complex than that of a television or movie script.
Unfortunately, these scripts—to be as widely accessible to audiences as possible—are aimed at modal twelve-to-thirteen-year-olds in terms of vocabulary and plot sophistication. Moreover, a dominant theme among them is that the government is the incubus of utter evil. We can track the growth of the meme just by noting the proliferation of the anonymous “they” in common speech.
So it is no accident, as they say, that Rudy Giuliani used the phrase “trial by combat” in his incitement speech of January 6—a phrase he claimed in his defense came directly from Game of Thrones, a claim that, by the light of this analysis, damned rather than exonerated him. Nor was it an accident that he invoked My Cousin Vinny in his bizarre stream-of-consciousness remarks at Four Seasons Total Landscaping in Philadelphia. Somewhere along the way, Hizzoner’s brainwaves got snatched by Hollywood scriptwriters and he hasn’t been the same since. As for Trump, someone who apparently has never read anything more penetrating than the back of a cereal box, conspiracies come to him very naturally.
When Trump’s supporters say that “he tells it like it is,” or that he knows how to connect to the average American, what they mean is that he’s a non-deep-literate person speaking to other non-deep-literate people using words and concepts that are inadequate to actually explain anything, but which rouse people’s passions and win their trust on an emotional level. In normal times, this sort of thing doesn’t matter on the level of national politics, for lack of a sufficiently large mobilized audience. In fraught, populist-accented times, it may work if left to ramble undeterred. Listen to this, and see if it rings a bell:
The mythical organization of society seems to be superseded by a rational organization. In . . . periods of relative stability and security, this rational organization is easily maintained. It seems safe against all attacks. But in politics the equipoise is never completely established. . . . We must be prepared for abrupt convulsions and eruptions. . . . In desperate situations men will always have recourse to desperate means—and our present day political myths have been such desperate means. . . . If modern man no longer believes in natural magic, he has by no means given up the belief in a sort of “social magic.” If a collective wish is felt in its whole strength and intensity, people can easily be persuaded that it only needs the right man to satisfy it.
These words were written by Ernst Cassirer in The Myth of the State, in the throes of a different crisis, one that played out on a different continent, and they refer to other “right men” who sought to satisfy the emotional turbulence of ordinary people in parlous times. The nearly eighty years between then and now have not, however, changed the basic point.
Fascist mythologies were quintessential conspiracy theories that overtook whole societies and doused them with blood. QAnon, too, is more than a bunch of disconnected irritable mental tics. It is not just a garden-variety conspiracy theory about past and present, but includes a narrative arc, spun by a charismatic prophet, that predicts a climax of violence, epiphany, and salvation in the future. Given its particular North American Protestant inheritance, and notwithstanding the anti-Semitism it shares with its European forebears, it comes across to non-believers as a bizarre combination of the Book of Revelation, the Rapture of the Plymouth Brethren, and a combat-themed video game. But in today’s spectacle-suffused hall of mirrors masquerading as a culture, that does not detract from its appeal. On the contrary, QAnon’s appeal to group cohesion, enabled by a belief in the improbable (something common to all cults), also fills a dire need for companionship among many deeply lonely people.
The ascent of conspiracy theories to the very heights of American political reality, then, has a composite origin. A major technological piece has summoned forth a spectacle mentality from amid our affluence-enabled obsession with entertainment. It is a spectacle mentality that rides high in rough proportion to the erosion of deep literacy, in which technological trends also play a major role. It draws from the multi-sourced return of zero-sum thinking at variance with the bases of America’s Enlightenment-born institutional arrangements. It certainly thrives on both the ambient anxiety afflicting a society that has been hemorrhaging social trust for decades and the specific anxieties induced by the coronavirus pandemic.
In a sense, it seems like a perfectly awful storm, a confluence of bad luck and bad faith that all came together in a gale of wild howls after November 3. How else except by dint of the suzerainty of spectacle can we explain the fact that something like fifty million adult Americans—70 percent of Republican voters in the November 2020 election—believed, at least as of January 6, that Donald Trump actually won the election in a landslide despite the facts that: his heretofore loyal Attorney General William Barr pronounced Trump’s claim “bullshit” to his face and resigned; even Mitch McConnell admitted the evidence-free lie once the Georgia Senate runoff elections concluded; many dozens of Republican election officials in Georgia, Arizona, Pennsylvania, and other states pronounced the President’s claims baseless; and dozens of judges at every level, including Republicans and even those appointed by Trump himself, told the President’s lawyers the same?
It is because facts don’t matter in the mentality of spectacle, and everything about Donald Trump, even his hair and “tan,” reeked of spectacle. Cults, religious and political alike, form in the fertile soil between fact and metaphor, between disciplined reasoning and associational roaming. It was not, then, incidental when Trump’s handlers and spinners repeatedly told us not to take him literally—that is to say, factually—so that we could never be sure how to distinguish his “real” pronouncements from the Ripley’s sort. Trump’s was the reality-TV presidency from the very start, but it was a presidency that never could have happened had not the culture already been deranged for the purpose. One may justifiably blame the clickbait, market-share-seeking commercial media for much of the Trump phenomenon, but in their partial defense, they were only giving most people what they manifestly wanted.
So now think again about what happened on January 6. It was suffused with expectations of spectacle, and spectacle it certainly was, whatever else it also was. It was founded on lies, including a very Big Lie. And it followed the script of conspiracy theories from start to hideous end. Was it, then, really about just one man, one moment, one place? If only.
Adam Garfinkle is a member of the editorial board of American Purpose, the founding editor of The American Interest, and a distinguished fellow at the S. Rajaratnam School of International Studies at Nanyang Technological University.