
AI can help. It’s still not therapy.

by Gianluca Campanella - Year 3 Homa Trainee


NCPS Magazine - AI & Therapy: Gianluca's article also appears in the NCPS online magazine.

I work in a field adjacent to AI, and I am training in humanistic psychotherapy. This places me in a somewhat unusual position. I am familiar with how contemporary AI systems are built, what they can and cannot do, and how quickly they are evolving. At the same time, I am learning what therapeutic work looks like when it is practised slowly, relationally, and with care.


From this vantage point, much of the current discussion about AI in therapy seems slightly off-target. The debate often centres on whether AI will replace therapists, or whether AI can ‘do therapy’. This framing assumes agreement about what therapy is, and treats AI as a stable, clearly defined entity. Neither assumption holds.


From where I stand, therapy is not primarily a technique or a service. It is a particular kind of relationship between humans. It is intentional, asymmetrical, and ethically bound. It unfolds over time and involves real risk. It is not simply a space where someone feels understood or receives helpful reflections, but a relationship in which presence, rupture, and repair all matter.


Crucially, this relationship is embodied. Much of what happens in therapy does not occur at the level of explicit content, but through tone, timing, posture, and nervous system state. Humans regulate and dysregulate one another continuously, often without noticing it. Therapy unfolds through this embodied capacity rather than bypassing it.


So: if therapy is an embodied relationship between humans, what is happening when one participant has no body?


Some accounts in neuroscience and philosophy suggest that meaning and consciousness are grounded in bodily sensation rather than abstract cognition. If this is the case, embodiment is not an optional extra; it is the ground of emotional significance itself. AI systems do not become overwhelmed, soothed, startled, or settled. Any sense of regulation remains one-sided. There is no shared physiological field; the interaction remains at the level of representation.


Language does not bridge this gap. Even when therapy is primarily verbal, nervous system states are encoded in voice, tempo, hesitation, and silence. We register safety or threat before we understand words. AI can reproduce linguistic form and simulate warmth, but it does not participate in the bodily processes that give those signals their force.


Temporality matters too. Therapy is not a sequence of good responses. It has pacing. It includes weeks where nothing resolves and pauses that feel unproductive. Often, that slowness is not a flaw. It is the container. Therapy allows digestion to take time. Meaning ripens when it is not forced. AI is optimised for immediacy, and speed is not neutral in this domain.


Moreover, therapy involves power, and that power runs both ways. Therapist and client can each fail to meet the other where they are. When that happens, something real is at stake. The rupture matters because the relationship matters. Repair, when it comes, is not the correction of an error but a deepening of trust. AI cannot risk anything: it cannot be let down, and it cannot let someone down in a way that costs.

Finally, therapists bring history, blind spots, preferences, and emotional reactions into the room. Therapy asks for these to be noticed and held responsibly, rather than removed. A therapist can be changed by the encounter. Over time, they carry the relationship. They remember. They may need supervision because something stayed with them. This is not sentimentality, but part of witnessing. The client is not processed; they are met. AI can store context, but it does not carry anyone in this way.


None of this renders AI useless. On the contrary, it may be genuinely good at functions that are adjacent to therapy but distinct from it. It can offer a space for externalising inner dialogue, allowing someone to articulate what they are experiencing before bringing it to a human relationship where the stakes feel higher. It can assist with reflective journalling, prompting someone to notice patterns or track their inner states over time.


These are valuable functions. For people without access to therapy, or between sessions, or in moments of overwhelm, this kind of support may be stabilising. It may help someone feel less alone, more organised, and more able to make use of human relationships when they are available.


It may even be the case that some forms of AI support produce outcomes comparable to therapy on certain measures. Even if that proves true, it would not settle the question I am raising here. Outcomes and meaning are not the same thing, even when they overlap.


Already, I see us living in a world where people are increasingly mediated by systems rather than met. Relationships are becoming harder to initiate, sustain, and deepen. Loneliness is endemic. In this context, treating relationship as optional has consequences.


If all forms of support are called therapy, therapy itself is quietly reshaped to fit what technology can deliver well: availability, consistency, and scale. What technology cannot deliver is the slow and sometimes painful process of being known.


Once we lose sight of that distinction, we risk forgetting what we are losing altogether. We will optimise for access and forget that some things matter precisely because they are not always accessible. We will celebrate convenience and lose track of why inconvenience might be part of the medicine. The friction of another person, the wait, the uncertainty: these are not obstacles to therapy. They are therapy.


AI can help. It may become an important companion to therapeutic work. What I remain unconvinced by is the idea that this amounts to therapy itself. As I currently understand it, therapy involves something that cannot be abstracted from bodies and relationships without changing its nature.


My view may prove incomplete. Technology evolves, and psychotherapy has never been as settled or timeless as it sometimes pretends to be. Still, if AI reshapes care in this space, it will require us to articulate something we have long taken for granted: that being met by another person is not a luxury, but the point.
