State provision of psychological health services is lamentable. Until things improve, let’s not judge those who turn to an app for help

It’s a sunny afternoon in a Roman park and a peculiar, new-to-this-era kind of coming out is happening between me and my friend Clarissa. She has just asked me whether I, like her and all of her other friends, use an AI therapist, and I say yes.

Our mutual confession feels, at first, quite confusing. As a society, we still don’t know how confidential, or how shareable, our use of an AI therapist should be. It falls into a limbo between the intimacy of real psychotherapy and the triviality of swapping skincare advice. That’s because, even though a conversation with a chatbot can feel as private as one with a human, we remain aware that its response is a digital product.

Yet it surprised me to hear that Clarissa’s therapist has a name: Sol. I wanted mine to remain nameless: perhaps not giving it a name is consistent with the main psychoanalytic rule, which is to keep personal disclosure to a minimum in order to protect the healing space of the so-called setting.