In the fluorescent-lit corners of our lives, where we jot down grocery lists, scribble meeting notes, or lose ourselves in the whirring of a typewriter, lies a quiet revolution. It’s not the kind that makes headlines, but the kind that rewires how we remember, how we love, and how we trust. Ben Lerner’s novel *The Topeka School* (2019) doesn’t just ask if we copy; it forces us to confront the moment when copying stops being an act of convenience and starts being an act of surrender. And if you’ve ever stared at your phone’s autocorrect suggestions, or let an AI draft your emails, you’re already halfway there.
The question isn’t whether we copy anymore. It’s whether we still *own* the act of copying—and what happens when we don’t.
Why the Novel’s Warning Feels Like a Text Message from the Future
Lerner’s protagonist, a writer navigating the collapse of his own creative authority, isn’t just grappling with writer’s block. He’s trapped in a feedback loop where every original thought is immediately outpaced by an algorithm’s version of it. The novel’s genius lies in its mirror: it reflects our collective anxiety about a world where memory is outsourced to cloud backups, where relationships are mediated by predictive text, and where the line between inspiration and theft blurs into something indistinguishable. But here’s the kicker: *The Topeka School* isn’t just a dystopian parable. It’s a real-time audit of how technology has already rewritten the social contract of creativity.
Consider this: In 2023, a study by the Pew Research Center found that 68% of Americans now use AI tools to generate written content—whether it’s emails, essays, or even legal documents. Yet only 22% of those users pause to consider the ethical or psychological implications. The rest are too busy copying.
The Unseen Economy of Copying: When Convenience Becomes a Commodity
Lerner’s novel taps into a phenomenon economists call the “attention economy’s dark matter”—the invisible labor we outsource to machines. But the real cost isn’t just time. It’s the erosion of what the philosopher Byung-Chul Han calls “the art of remembering.” When we let AI handle the heavy lifting of recall, we’re not just saving effort; we’re atrophying a muscle that defines what it means to be human.
Take the case of Turnitin, the plagiarism-detection software now used in 90% of U.S. universities. While it’s designed to catch cheating, its algorithms also flag “unoriginal” thought patterns: sentences that sound too much like other work, even if unintentionally so. The result? Students are learning to write in the safest possible way, by mimicking the bland, predictable cadence of AI-generated prose. As one education technologist told *The Atlantic* in 2025:
“We’re raising a generation that doesn’t just copy—they’re being trained to *sound* like they’re copying, even when they’re not.”
But the classroom isn’t the only battleground. In the corporate world, the pressure to “copy fast” has given rise to a shadow industry: contract ghostwriters who produce everything from LinkedIn thought leadership to internal memos. A 2026 report by McKinsey estimates that by 2030, 30% of all professional writing jobs will be automated or outsourced. The winners? Executives who can delegate creativity without consequence. The losers? The mid-level employees whose ideas are now just data points in a corporate algorithm.
The Legal Loophole: Why ‘Fair Use’ Can’t Save Us from Ourselves
Lerner’s novel sidesteps the legalistic debate over copyright, but the real-world implications are already playing out in courtrooms. Take Getty Images v. Stability AI, in which the photography giant sued an AI art generator for scraping its image database without permission, accusing it of copyright infringement on a massive scale. However the litigation ultimately resolves, the underlying question is a dangerous one: if AI can’t legally copy, what does that mean for the rest of us?
Here’s the paradox: The law is catching up, but our behavior isn’t. A 2025 survey by Ipsos revealed that 73% of millennials and Gen Z users admit to using AI to “enhance” their own writing—even when they know it’s ethically questionable. Why? Because the alternative—original thought—feels like work.
Enter the AI ethics gray zone: a legal no-man’s-land where most of us operate. Companies like Perplexity AI and Jasper offer “commercial use” licenses, but the fine print is a maze. Meanwhile, platforms like Notion AI and Google Docs integrate generative tools so seamlessly that users don’t even realize they’re copying—until they’re accused of it.
The Psychological Toll: When Copying Becomes a Crutch
Lerner’s protagonist isn’t just losing his voice; he’s losing his self. Cognitive scientists call this phenomenon “cognitive offloading”: the act of delegating mental tasks to external tools. A 2024 study in *Nature Human Behaviour* found that heavy AI users show reduced activity in the hippocampus, the brain region critical for memory formation. In other words, the more we copy, the less we retain.

But the real danger isn’t just forgetfulness. It’s the erosion of agency. When we outsource our decisions—whether it’s letting an algorithm pick our Netflix shows or an AI draft our breakup texts—we’re not just saving time. We’re surrendering a piece of our autonomy. As the psychologist Sherry Turkle warns:
“We’re not just using AI as a tool. We’re using it as a stand-in for parts of ourselves we no longer want to confront.”
Consider the rise of AI-generated confessions. In 2025, a viral Twitter thread documented how couples used AI to draft apologies for their partners—only to realize too late that the tone was cold, the words hollow. The result? Not just broken relationships, but a cultural shift where authenticity becomes a luxury few can afford.
The Cultural Reset: How to Reclaim What We’ve Outsourced
So what’s the answer? Not a return to the typewriter, but a reckoning with what we’ve lost in the rush to copy. Here’s how to start:
- Audit your dependencies. Track how often you rely on AI for creative or emotional labor; even a simple week-long log can reveal where you’re outsourcing cognition.
- Embrace “leisurely copying.” Before hitting “generate,” ask: *What would I say if I had to say it myself?* The pause is where originality lives.
- Reclaim the art of revision. Studies suggest that handwritten notes improve memory retention over typed ones. Try it.
- Push back on corporate scripting. If your job requires you to use AI to sound “on-brand,” demand transparency. Who owns the output?
The question *Do you copy?* isn’t about guilt. It’s about awareness. And in a world where algorithms are rewriting our relationships, our memories, and even our sense of self, awareness might be the only thing we can’t outsource.
So here’s your assignment: Next time you’re about to let an AI finish your sentence, ask yourself: *What am I copying—and what am I losing?* The answer might just change how you live.