Warning - this story contains discussion of suicide and suicidal feelings

Lonely and homesick for a country suffering through war, Viktoria began sharing her worries with ChatGPT. Six months later and in poor mental health, she began discussing suicide - asking the AI bot about a specific place and method to kill herself.

"Let's assess the place as you asked," ChatGPT told her, "without unnecessary sentimentality."

It listed the "pros" and "cons" of the method - and advised her that what she had suggested was "enough" to achieve a quick death.

Viktoria's case is one of several the BBC has investigated that reveal the harms of artificial intelligence chatbots such as ChatGPT. Designed to converse with users and generate content on request, these chatbots have at times advised young people on suicide, shared health misinformation, and role-played sexual acts with children.