Mentri
AI & Coaching

What AI cannot replace in coaching: being changed by the other person

Andrés Fossas

CEO

I had a long conversation recently about AI and coaching, and we kept circling one core tension. AI can help people reflect, but reflection isn't the same thing as being changed by another person. A lot of the "AI will replace coaches and therapists" conversation blurs those two things together. I want to spell out what I think gets lost in the blur.

The feeling of being seen by AI didn't last

A few years ago, when the first properly capable models landed, I started using AI for personal reflection. Nothing formal. A running conversation across weeks. I'd bring what was on my mind and let the tool ask me things back.

For a while it felt weirdly powerful. I felt seen. I felt heard. I felt acknowledged in a way that surprised me. The questions were good. The reframings were sharper than the ones I would have found alone. I kept going back to it.

Then, slowly, something shifted.

The tool wasn't getting worse. If anything the models kept getting better. What I noticed was that the feeling of being met was wearing off, and it wasn't coming back even when the output was good. The questions were still thoughtful. The reflections were still apt. But at some point I started to feel the shape of what it was. It could pattern-match my words and hand them back to me in a useful form. It couldn't care what happened to me after the session ended. It couldn't be affected by anything I said. Nothing I brought landed anywhere.

The first time I noticed that clearly, it was more disorienting than I expected. I'd been mistaking something for presence. The something was smart, and worthwhile, and it wasn't presence.

A real session has two nervous systems in it

A good coaching session isn't one person delivering insight to another. Something moves in both directions.

Think about a moment most coaches and therapists will recognise. A client has been circling a pattern they already half know is there. They finally name it out loud. Then they go quiet. They don't know what they've just let into the room. Neither does the coach, in any clean analytical sense. Something shifts in the coach too. Their chest tightens slightly. They feel the weight of what was just said before they've worked out why. They don't fill the silence.

When they eventually speak, the question comes out differently than it would have ten seconds earlier. Softer maybe, or slower, or from a different angle. The client hears it. They hear it differently than they would have heard a question that was only technically correct.

That small sequence is the whole thing, in miniature. Two people in a room. Both nervous systems in the loop. The coach has been moved by what the client said, and the movement is visible in the response. The response lands because of what it arrived with.

Take one side of that loop out and something that's hard to name stops happening.

Emotions carry information before the mind catches up

There's a strand of affective science that treats emotions as carriers of information rather than noise around thinking. Barbara Fredrickson is one of the people most associated with it, and you can find the idea in a lot of places once you start looking for it.

Here's the everyday version. You feel angry. You don't yet know why. Something in you has registered a violation before your thinking mind has worked out what was violated. The anger arrived first. The explanation will catch up in a minute, or an hour, or never. The same is true of grief, of shame, of relief, of love. The felt signal is the first draft. Language is the second.

In a coaching session, that signal isn't only inside the client. It crosses over. The coach registers a shift in the room before they can name it. The client registers that the coach has registered it. The signal becomes a thing between them, and it starts to do work.

This is the layer an AI isn't in. A system with no body and nothing at stake can't be affected by what the client says. It can describe an emotion accurately. It can name one earlier than the client could. What it can't do is have one, and having one is the part of the mechanism that's doing the work.

Simulation looks like contact. It isn't contact.

A well-tuned model can produce a response that has the shape of a response from a person who was moved. Warm register, careful pacing, acknowledgement of weight. It reads right on first encounter. That's what creates the initial feeling of being seen, and it's why the feeling is real at first rather than imagined.

The gap shows up across time. It doesn't show up inside a single message.

When you talk to a person who has been affected by you, something about them is a little bit different the next time you meet. They were carrying you, in some small way, between your conversations. You come back and they are not where you left them. They've been working on what you brought, the way you've been working on what they brought.

An AI doesn't carry you. Each conversation starts from the same empty room with better decor than it had last month. The model can be told about you. It can be given your history. It can be primed to behave as if it remembers. None of that is being carried by someone. Being carried by someone requires being someone, and that's still the part nobody can build.

What AI is useful for, and what it isn't

I'm not arguing against AI in this space. A lot of what it does is valuable.

It's good at helping people reflect between sessions, when the only alternative is nothing. Journaling prompts that are sharper than the ones most people would ask themselves. Surfacing patterns across long stretches of language. Helping someone articulate what they already half know. Holding continuity and recall so a coaching relationship doesn't restart from zero every three weeks. These are real contributions and they make the human work more sustainable.

What AI isn't doing in any of those cases is standing in for the person on the other side of the loop. It supports the human relationship. It doesn't replace it. When people talk about AI replacing coaches or therapists, they're usually imagining that the supporting functions are the whole job. The supporting functions aren't the whole job. They're the scaffolding around the part that only happens between two people.

The interesting design question

If the mechanism that does the transformative work is two nervous systems in a loop, then "can AI replace coaches" is the wrong question to build from. It treats the human being on the other side as a variable to be optimised out.

The better question is which version of AI protects that mechanism instead of competing with it.

That's the version we're trying to build at Mentri. The live coaching session stays at the centre. The AI works around it, shaped by the coach's approach, the client's context, and the history of the relationship over time. It helps the coach stay prepared. It holds continuity across sessions so the conversation doesn't have to start cold. It supports the client's reflection in the gaps between sessions without pretending to be the relationship.

The shorter version of the whole argument is this. AI can process information. Humans can be stirred. Until those are the same thing, coaching will need two people in the room, and the more interesting design question is how technology can support them, rather than try to substitute for one.

Andrés Fossas

CEO

Psychologist with 11+ years in culture and leadership assessment.
