You Can’t Trust Anything Anyone Writes
AI is creating a crisis of authenticity.
Like it or not, generative AI is becoming increasingly embedded in the process of seeking information, sharing ideas, and producing images. Want to know where to find pizza near you? Consult Gemini. Agonizing over the closing line of an email? Claude has 14 suggestions that strike just the right tone. Need a photorealistic image of a malnourished child building a boat out of popsicle sticks? ChatGPT thought you’d never ask—but will gladly oblige!
With tools this multipurpose, it’s no surprise our workplaces expect us to at least put up a convincing facade that we’re embracing them and making massive productivity gains. Or that our social media feeds are flooded with low-quality AI content generated by people desperate for clicks and views.
But I’m not sold on the narrative that generative AI marks the democratization of communication, a new creative renaissance, or even a particularly good thing for productivity properly defined. As Neil Postman put it almost thirty years ago: “The question, ‘What will a new technology do?’ is no more important than the question, ‘What will a new technology undo?’” And given that AI is frequently pitched as a replacement for any human speech mediated through a screen, it stands to undo quite a lot. What’s at stake is nothing less than authenticity itself.
Authentic expression helps us understand truth. Not objective, capital-“T” truth, but subjective truth—the kind that reflects a person’s interiority. Generated by systems lacking consciousness or lived experience, AI content, by definition, cannot be authentic.
Why should we care? Because authenticity is foundational to trust, the thread that ties human relationships together and allows us to work toward shared goals. Without the assumption of authenticity in communication, the small amount of trust that remains in our institutions is at risk of unraveling further, making everyday experience more clinical, cynical, and fake.
Let’s say you work a 9 to 5 job. After successfully guiding a project to completion, your team receives a congratulatory email from your supervisor. Even without the stilted tone and excessive use of the word “delve” (a common AI tell), you’d clock the message as AI-generated from the sloppy cut-and-paste job, which leaves a slime trail of font-style inconsistencies.
But regardless of the content, knowing that your supervisor did not engage in the deliberative process of writing makes the message ring hollow: No matter how well-intentioned the messenger, it is not ultimately their message. It’s the seed of a thought filtered through an algorithm developed by a tech company. And you wouldn’t be wrong to feel demoralized about that.
More concerning, however, is the race to the bottom this could create. You might become suspicious of your coworkers, assuming they’re also outsourcing their expression to the bot. Maybe you’ll even decide to use AI for your own communicative tasks. After all, if no one else can be bothered to speak from the heart, you might as well save 10 minutes on emails and squeeze in one more YouTube video between projects. Before you know it, one message at a time, your workplace becomes less connected, less trusting, and less humane.
“If it seems to us that other people are playing fair and doing their share, we do, too. If not, not,” wrote social scientist Robert Putnam in Bowling Alone. Trust does not simply materialize out of thin air, he argued. It requires trustworthiness. Building on Putnam’s point, we might say that in a high-trust space, a well-adjusted person would presume authenticity unless given good reason not to. Unfortunately, knowing AI use is widespread, rarely disclosed, and often occurs without regard to context, it’s much harder to assume authenticity in online communication.
Creative fields shed light on what creeping inauthenticity looks like in the real world. In a testimonial for journalist Brian Merchant’s newsletter “Blood in the Machine,” costume construction artist Rachel E. Pollock shares how AI amped up the surreality of her job, leaving her to contend with wildly unrealistic projects. “Someone will share the AI generated costume ‘designs’ and they will be literally impossible to construct for an actual human with materials available in the actual world,” she said, giving the example of “gravity-defying materials on pornographically cartoon bodies.”
An anonymous copywriter, meanwhile, described what it feels like to be on the other side of the equation, pressured to use AI for processes that once felt gratifying. “Writing like this doesn’t feel like it used to,” they said. “I don’t do the hard work that makes me feel alive afterwards. It’s different, more clinical, and much less rewarding.”
Of course, inauthenticity existed long before generative AI came on the scene. Spaces from offices to church groups have always been susceptible to social posturing that hinders the formation of deep relationships, and technological changes have arguably amped up the incentives for inauthenticity. AI may simply be the logical conclusion of this trend. In an episode of the podcast “Your Undivided Attention,” political scientist Sonja Amadae discusses the idea of “gamification,” arguing that the embrace of AI-generated content is a predictable outcome in a society that uses language tactically instead of expressively. When people use words not to convey their authentic thoughts and feelings, but as “tokens” for achieving strategic ends, it makes sense that they will gravitate towards outsourcing communication entirely to a bot.
I recently finished watching “Mad Men.” The show is set in the 1960s, before communication was regularly mediated through computers, but its main character, Don Draper, is the perfect example of how corrosive inauthenticity is to the soul.
Through his role at an ad agency, Draper attempts to escape his troubled past by spinning impressive campaigns from threads of truth from his own life. Throughout the show, however, he is haunted by his failure to integrate the various aspects of his identity. Using his pain as raw material for his work is not constructive when that pain is decontextualized, sanitized, and generalized through the language of advertising. Not only does failing to resolve this tension make him a restless, unsettled person, but it also impacts the way he sees other people: not as fully realized human beings but as static symbols of sexuality, domesticity, or authority. This is often exposed in his romantic relationships, which tend to fall apart right around the time the other party starts showing signs of human complexity.
To be sure, there’s a place for less-than-complete authenticity, for strategy in communication, for self-editing or tailoring a message to an audience to accomplish practical rather than expressive ends. The line between artistry and artifice has always been a blurry one, and most interactions contain an inseparable mixture of both.
However, just because authenticity is difficult to quantify, that doesn’t mean we shouldn’t value it at all. The more our tactical objectives come at the expense of lowercase-“t” truth, the more disconnected we become from ourselves and the world around us.
Continuity between thoughts and words, words and actions, matters immensely, because language not only describes reality: It creates it. Whether you can get behind that claim because you appreciate humanity’s constructive capacities or because “in the beginning there was the word,” we all have reason to recognize the miraculous power of language—that essential human technology that allows us to reveal a bit of our inner world to others and get a glimpse of theirs in return.
Maybe those marveling at AI’s ascendant role in communication are missing the forest for the trees, praising algorithms capable of turning expression into content and ignoring the expression itself that makes such reduction possible. After all, human expression is worth marveling at. The seed of a thought, once planted, flowers into an idea, which feeds a conversation, which nourishes an action, until possibilities once inconceivable are playing out in the world, cultivating fertile ground for further discovery.
Forgetting this process means forgetting who we are.
Talia Barnes is a visual artist and communications professional interested in media, culture, and the creative process. You can find more of her work on Substack at Art Life Balance.