As a therapist, I know the value of quality therapy. I’m well-versed in the research: the most significant predictor of therapeutic change is the quality of interpersonal attunement between therapist and client.

And yet, when I found myself going through a hard time, I turned to ChatGPT for help.

My best friend’s spouse had recently died, and in the months that followed, I sensed my friend steadily pulling away. I’d read up on things to say and not to say to a grieving widow, was familiar with the stages of grief, and yet I found myself adrift, unsure of how to respond to this unanticipated distance.

Was it something I’d done? Was this typical? Would it last forever? And most importantly, how could I continue to be a supportive friend within these changing dynamics, honoring her need to grieve in her own way while protecting and preserving our friendship?

Tentatively, I opened the chatbot. I chose my words carefully, avoiding biased language that would cast me as a victim or inject my own triggered emotions into the equation, and anonymizing details just to be safe. I wanted facts, culled from the collective intelligence of therapists present and past and the wisdom of every published psychological theory, distilled into one succinct(ish) response.