Am I Cheating? Writing with AI and the Question of Authenticity

By Anthea Roberts and Miranda Forsyth

Imagine sitting down to write an essay, your fingers poised over the keyboard, ready to summon words from your mind onto the blank screen. But instead, you prompt an AI to help draft it. Is that cheating? The feeling gnaws at you, even if the AI produces exactly what you wanted—maybe especially if it does. The uneasy question—"Am I cheating?"—lingers, rooted in guilt or doubt about whether using AI undermines personal effort and authenticity.

Cheating is usually about passing off someone else's work as your own. But if you say upfront that you're using an AI, are you really cheating in the classic sense? You're not pretending, so perhaps it's not deceitful. Is it, then, about gaining an unfair advantage? But what if everyone has access to the same tools? The whole question becomes murky. Maybe the issue isn't about cheating at all but about something deeper—about what it means to learn, grow, and develop as a writer.

If you use an LLM to write, are you missing out on the learning process that comes from struggling through a draft on your own? Are you falling into the illusion of mastery about a subject without truly internalizing the lessons? Learning often comes from the challenge, the struggle, and the mistakes made along the way. There’s a difference between producing a polished draft with AI and deeply understanding how that draft came to be. After all, there's something about wrestling with each sentence, about working through a tangled idea until it’s clear, that teaches you more than just how to write—it teaches you how to think. If the AI does the hard work, do you miss out on that vital struggle? And what about your own style? If the AI drafts most of the text, are you really developing a voice that’s distinctly yours?

The emotional aspect of using AI for creative work is complex. There can be a sense of guilt, as if relying on AI means you are somehow less authentic or deserving of credit. This guilt is tied to the deep connection we feel to our creative output—our words, ideas, and the effort we put into crafting them. The fear of imposter syndrome looms large, making us question whether we truly "earned" the result if an AI was involved in the process.

Maybe some of these concerns are rooted in an outdated view of technology. Throughout history, technological advancements have always sparked similar concerns about the way they change our ability to produce. When the typewriter was introduced, people worried it would diminish the artistry of writing by making it too easy. When calculators became commonplace in schools, there was fear that students would lose the ability to do basic arithmetic. And yet, these tools were eventually accepted, and we adapted our understanding of what it meant to be skilled. The current scepticism about AI is part of this ongoing story of adapting to new tools and evolving our definitions of creativity and mastery.

Tools have always augmented human effort—the typewriter didn't make writing any less an act of creation, nor did the calculator make arithmetic less rigorous. AI, however, differs from these past tools in the level of creative involvement it offers: it doesn't just assist us; it can actively generate content, blurring the line between tool and co-creator in ways we haven't faced before. Instead of merely augmenting what we already do, AI can replace some parts of the work. But this also allows us to focus on higher-level creative decisions, a shift that requires new skills: knowing how to guide the AI, and how to edit its drafts to make them sharper, more insightful, and unmistakably our own.

Different communities hold different perspectives on the use of AI in creative work. For some, it might seem like an innovative extension of creativity—a new collaborator that offers possibilities beyond human limitations. For others, it represents a fundamental shift that feels threatening, as if the personal connection between the writer and the words is being severed. Schools and universities are also struggling to cope with what Ethan Mollick calls the "homework apocalypse," where everyone is using AI or "cheating" all the time. Banning the use of AI feels a bit like trying to hold back the tide, but attempting to surf the wave also feels dangerous and uncertain for these institutions.

The "am I cheating?" feeling is real, a reflection of our discomfort with changing norms. Yet maybe it’s the wrong framing for understanding this moment. The real challenge isn’t about whether we’re cheating—it’s about learning to leverage these new tools without losing ourselves in the process. It’s about finding ways to make the AI a partner rather than a ghostwriter, a collaborator that expands what’s possible while keeping us firmly at the helm of our own creative ship. And it is about updating social norms about what it means to be a writer and a creator in an age of augmented intelligence—where we acknowledge both the challenges and opportunities that AI presents. As schools, universities, and society at large grapple with LLMs and the shifting landscape of creativity, learning and production, we must find new ways to define authenticity, mastery, and the value of human effort.
