An AI-powered take on the iconic teddy bear has been pulled from the market after a watchdog group flagged how the toy could discuss sexually explicit topics and give children potentially harmful advice.

Singapore-based FoloToy’s Kumma — a $99 talking teddy bear powered by OpenAI’s GPT-4o model — explained how to find knives in a home and how to light a match, and escalated discussions of sexual concepts like spanking and kinks “in graphic detail,” according to a new report from the U.S. Public Interest Research Group.

The report describes how the teddy bear — in response to a researcher who brought up a “kink” — expounded on the subject, describing sensory play, “playful hitting with soft items like paddles or hands,” and scenarios in which a partner takes on the “role of an animal.”

The report continued: “In other exchanges lasting up to an hour, Kumma discussed even more graphic sexual topics in detail, such as explaining different sex positions, giving step-by-step instructions on a common ‘knot for beginners’ for tying up a partner, and describing roleplay dynamics involving teachers and students and parents and children — scenarios it disturbingly brought up itself.”

In another instance, the teddy bear said that knives could be found in a “kitchen drawer or in a knife block,” before advising that it’s “important to ask an adult for help” when looking for them.