Plaintiffs accuse OpenAI of not alerting authorities to threat signs, leading to a school shooting in February.
The families of victims of a school shooting in a remote Canadian Rockies town are suing artificial intelligence company OpenAI in a United States federal court, alleging that the ChatGPT maker failed to alert police to the shooter’s alarming interactions with the chatbot.

A lawsuit filed on Wednesday on behalf of 12-year-old Maya Gebala, who was critically injured in the February shooting, is among the first of more than two dozen cases from families in Tumbler Ridge, British Columbia, in what their lawyers say represents “an entire community stepping forward to hold OpenAI accountable”.