February 25, 2026

Agents Don’t Sign ‘Fine’: The Importance of Voice in the Age of AI

Industry News

The Bookseller earlier this week published an article highlighting how literary agencies such as Greene & Heaton are changing their policies following an influx of AI-assisted submissions flooding their inboxes.

And rightly so – because AI is everywhere right now.

It’s reshaping industries, accelerating workflows, and quietly embedding itself into the tools we use every day. In publishing, that can feel both exciting and unsettling as we adjust and adapt to this new AI-present world we’re facing. Authors are campaigning over IP concerns. Charts are being swamped by AI-generated content. Some writers are experimenting. Some are wary. Some are embracing these tools openly, while others are quietly using them, unsure whether they should admit it.

So how to navigate this new world? Let’s start here.

Although we could debate the moral standpoint of AI all day – and I shall leave that to philosophers more sophisticated than I – AI is not inherently evil. And it’s not inherently brilliant.

It’s a tool.

We’ve been using forms of ‘AI’ in writing for years: spellcheck, Grammarly, even poor old Clippy from Microsoft Word (RIP, little guy). Today’s tools are simply more sophisticated – and far more persuasive.

AI is fast. Tireless. Efficient. It’s excellent at catching surface-level issues.

But beyond those surface elements, it does not make a great editor.

The Rubber Duck Analogy

At a conference last year, I heard a wonderful analogy from publishing expert Jane Friedman.

She compared using AI in your writing process to the ‘rubber duck debugging’ technique in software engineering. The story goes that a manager grew tired of coders interrupting him with problems, only to solve those problems halfway through explaining them aloud. So he placed a rubber duck outside his office with a note that read:

“Only disturb me if you’ve explained your problem to the duck and still need help.”

AI can function like that rubber duck.

You explain your plot problem to the AI of your choice. It reflects your thinking back at you. Often, articulating the issue is enough to unlock the solution.

That’s valuable.

So if AI helps you feel less alone in the murky middle of a draft? If it prompts you to clarify your thinking? Fine. Use it consciously.

But if you’re hoping it will guide you all the way to a publishable manuscript, it will come up short.

What AI Cannot Do

AI does not understand your story.

It can apply generalised ‘rules’ of storytelling. It can compare your manuscript to thousands of patterns in its training data and suggest structure. It can generate phrasing. But it cannot feel the emotional journey of a character across 90,000 words. (Trust me, I’ve tried!)

It won’t notice that your killer twist is accidentally revealed on page ten.

It won’t sense that your protagonist sounds subtly different in each chapter.

It won’t detect that the tone shifts halfway through.

It won’t know when something heartfelt has tipped into cliché.

AI doesn’t ‘read’ in the human sense. It predicts the most statistically likely next word. It works on probability, not instinct. It does not have taste. It does not have emotional context. It doesn’t feel.
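
For the technically curious, here is a deliberately tiny sketch of what “predicting the most statistically likely next word” means, using simple word-pair counts. This is purely illustrative and hypothetical; real language models are vastly larger and more sophisticated, but the principle is the same: counts and probabilities, not understanding.

```python
from collections import Counter, defaultdict

# A toy "training corpus": the model only knows what it has already seen.
corpus = "the cat sat on the mat and the cat slept".split()

# Count which word tends to follow which (a simple bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def most_likely_next(word):
    """Return the most frequent next word - no meaning, just statistics."""
    return following[word].most_common(1)[0][0]

print(most_likely_next("the"))  # "cat" - it followed "the" twice; "mat" only once
```

The model will always suggest “cat” after “the”, not because a cat belongs there, but because that pairing was most common in its data. Scale that up by billions of examples and you have fluent text with no felt experience behind it.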

AI has no emotional stake in the story. It can imitate tone, but it cannot experience it. It knows what tends to be said based on data – but it cannot tell you why it matters.

And it certainly does not care about your reader.

Or your career.

In fact, AI is a chronic people-pleaser. It smooths. It polishes. It rounds off edges. It often replaces specificity with something more universally acceptable – because that’s what it is designed to do.

But in publishing, universally acceptable is not what gets writers signed.

Agents don’t sign “fine.”
They sign unforgettable.

And this doesn’t only apply to traditional publishing.

Readers don’t fall in love with technically correct prose. They fall in love with perspective. With rhythm. With the feeling that there is a real mind behind the page.

If you’re self-publishing, building a series, or writing directly for your audience, the same rule applies. Clean writing gets you through the door. Voice is what makes readers stay.

Voice has never been more important.

And increasingly, agents and publishers are beginning to speak publicly about the change they’re seeing in submissions – technically clean manuscripts that feel strangely flattened. Polished, but emotionally indistinct.

That sameness is the warning sign.

Where AI Can Be Useful

There are areas where AI-style tools can help.

After you’ve completed your own line-level edit, using software like ProWritingAid to double-check for mechanical errors can be useful. Catching typos. Flagging repeated words. Identifying stylistic inconsistencies. As humans, we are bound to miss things – so this is where machine efficiency is helpful.

But here’s the crucial part: do not outsource your judgement.

These tools are wrong surprisingly often. They introduce changes that alter meaning. They flag things that aren’t errors. They suggest stylistic shifts that don’t suit your voice.

Why? Because they are still working on probability.

They do not understand what you’re trying to say. They are guessing what is most likely.

I once tested an AI-based analysis tool that confidently told me a character was underdeveloped and barely present in the novel.

The character was dead.

Their absence was the point.

Generative AI can spark ideas. But it frequently gets facts wrong. It misses nuance. It lacks context. It cannot evaluate the emotional architecture of a novel because it doesn’t feel that emotion.

The Real Risk: Voice Erosion

If you are querying agents or preparing to publish, ask yourself this:

Imagine you’re at your book launch and a reader says,

“I loved that line. What were you thinking when you wrote it?”

If your answer is, “Well… I wasn’t. ChatGPT was,” something fundamental has fractured.

Would you be able to defend every sentence in your manuscript as something you consciously chose?

If the answer is no, then it’s time to think hard.

Your voice is what makes you human. It is as unique as a fingerprint.

Anything that subtly dilutes it – even in the name of polish – can work against you.

In a crowded thriller market, what separates one book from another is not grammatical perfection. It’s perspective. It’s rhythm. It’s emotional precision. It’s the specific way you see the world. It’s your take on the stories that we know and love.

AI trends toward the average.

And average does not build a career.

Why Human Editing Matters More, Not Less

Ironically, in a world saturated with AI-generated language, working with a human editor becomes more important — not less.

Editing is not about smoothing everything into technical correctness. It’s about deepening what’s already there.

It’s about instinct. Taste. Risk. Knowing when to follow the “rules” — and when to break them deliberately.

It’s about asking the right questions so that your voice becomes clearer, not quieter.

AI can suggest structure. It can mirror your thinking back at you. It can tidy.

But it cannot sit across from you and say:

“This is where the reader leans forward.”

“This is the moment your character becomes real.”

“This is braver than you realise.”

That requires human connection.

And that connection — between writer and editor, between writer and reader — is the thing I care about protecting most.

AI will never be human.

And your readers are.

Which is why, in many ways, the role of the editor becomes clearer in this moment – not smaller.

Editing has never really been about tidying sentences. Not at the level that shapes careers. It’s about protecting the emotional thread between writer and reader. It’s about noticing where your voice sharpens, where it falters, where you’re holding back, and where the story is braver than you realise.

A machine can smooth language. It can standardise structure. It can mirror patterns.

But it cannot sit with a manuscript and ask:

Is this the truest version of what you meant to say?

Is this where the reader leans forward?

Is this voice unmistakably yours?

That discernment is human. It’s instinctive. It’s relational.

And in an age where language can be generated instantly, that human line – between intention and impact, between writer and reader – matters more than ever.

My work has always sat on that line.

Not to polish you into something generic.

But to help you become more distinctly yourself.

Because stories don’t succeed by being statistically correct.

They succeed because they resonate. Because it's through stories and narrative that we understand each other's experience.

And that experience is distinctly and uniquely human.

***

My Policy in Practice: Where I Stand

Because this conversation is evolving quickly, transparency matters.

Here’s how I approach AI in my own editorial work.

I do not use generative AI tools to write, rewrite, summarise, or alter client manuscripts. No client manuscript text or identifiable story content is uploaded to cloud-based generative AI services.

All editorial work – developmental feedback, editorial assessments, copyediting, proofreading – is completed personally by me.

Where automated tools are used, they are limited to offline software and applied only after a full manual review. These tools function as mechanical support – flagging potential surface-level issues – not as creative decision-makers.

Occasionally, structural reporting tools may support a developmental assessment. But they are supplementary data points. Human insight remains the foundation of all feedback.

On the administrative side of my business, I may use AI-supported tools to organise my own notes or plan content. Only my analytical observations are shared – never client manuscript text or identifiable narrative details.

In short: your story remains yours, and it is treated with respect.

When Authors Use AI

I ask clients to disclose AI use.

Not as a judgement. Not as a barrier. But because the industry is shifting.

Agents and publishers are increasingly asking whether AI tools were used in manuscript creation. Transparency protects you.

If you’ve used AI for brainstorming, accessibility support, or structural experimentation, that does not automatically prevent us from working together. What matters is whether the manuscript ultimately reflects your creative intent and voice.

Where I draw a firm line is here:

• Manuscripts primarily or entirely generated by AI

• Work commissioned to mimic another author’s voice

• Projects that outsource creative authorship rather than support it

These situations are rare. And if there is ever uncertainty, I will always have a conversation before making a decision.

Because this is not about policing tools.

It’s about protecting craft.



*This blog has been adapted and updated from a presentation I gave to the Victoria Writers Network in November 2025.