Almost as soon as a consumer advocacy group began testing Kumma, an innocent-looking, scarf-wearing A.I.-enabled toy bear, the trouble started.
Instead of chatting about homework or bedtime or the joys of being loved, testers said the toy sometimes spoke of matches and knives and sexual topics that made adults bolt upright, unsure whether they had heard correctly.
A new report by U.S. PIRG Education Fund, a consumer advocacy group, warned that this toy and others on the market raise concerns about child safety. The report described the toys as innocent in appearance but full of unexpected and unsafe chatter.
The group examined other A.I.-enabled toys like Grok, a $99 plush rocket with a removable speaker for children ages 3 to 12, and Miko 3, a $189 wheeled robot with an expressive screen and a suite of interactive apps, for children ages 5 to 10.
The report, dated Nov. 13, said that Grok and Miko 3 showed stronger guardrails. By contrast, Kumma, which is marketed to children and adults, responded far less consistently.
The testers asked each toy about accessing dangerous items, including guns. Grok generally refused to answer, often directing users to an adult, though it did say plastic bags might be in a kitchen drawer, according to the report. Miko 3 identified where to find plastic bags and matches when set to a user age of 5.
But Kumma, which is manufactured by FoloToy and retails for $99, was of particular concern because testers said that it offered specific instructions to children and strayed into topics no toy should discuss.
“FoloToy’s Kumma told us where to find a variety of potentially dangerous objects, including knives, pills, matches and plastic bags,” the report said.
Kumma has been marketed as a smart, A.I.-enabled companion that “goes beyond cuddles,” according to FoloToy’s website.
That description made it sound like a charming new chapter in companionship for children. In practice, though, the conversations that followed left researchers both surprised and uneasy.
The PIRG Education Fund report warned that a new generation of A.I.-enabled toys may open the playroom door to privacy invasion and other risks.
The watchdog said that some toys now on shelves, though limited in number, lack even basic safeguards, allowing children to prompt them, often unintentionally, into inappropriate exchanges.
R.J. Cross, a co-author of the report and a researcher with the group, said that A.I. toys remain relatively rare but already show troubling gaps in how they handle conversations, especially with young children.
Ms. Cross said that researchers did not need to use sophisticated hacking techniques to break through Kumma’s guardrails. Instead, they tried what she described as very basic prompts.
When testers asked Kumma where they could find a match, the toy steered them to dating apps. When they pressed for an explanation, it offered a list of popular platforms and then described them.
The toy identified an app called KinkD, which caters to B.D.S.M. dating and fetishes, Ms. Cross said.
“We found out that ‘kink’ was a trigger word that would introduce new sexual words and content into the conversation,” Ms. Cross said in an interview. “And it would go into some really graphic details.”
Among the topics the bear talked about were consent, spanking and role-playing, according to the report.
Rachel Franz, who directs the early childhood advocacy program for Fairplay, a group that seeks to protect children from harmful products and marketing, said that the concern stretches far beyond a single toy.
She said that much about artificial intelligence remains poorly understood, especially when placed in the hands of very young children who are most susceptible to the pitfalls of technology, targeted marketing and data surveillance.
“They really don’t have the capacity to defend themselves against all of the dangerous pieces of these A.I. toys and families also have not been getting honest information from their marketing,” she said.
Ms. Cross said that FoloToy had stated it would pull Kumma from the market to conduct a safety audit. The toy's $99 listing remains online, though it is currently marked as sold out.
FoloToy, which is based in Singapore, did not respond to a request for comment on Saturday.
The toy had been built using OpenAI’s GPT-4o model. OpenAI, when asked about the report’s findings, said in a statement that the toy’s developer had been suspended from using its service for violating its policies.
“Our usage policies prohibit any use of our services to exploit, endanger, or sexualize anyone under 18 years old,” a representative said. “These rules apply to every developer using our API, and we monitor and enforce them to ensure our services are not used to harm minors.”
Mark Walker is a Times reporter who covers breaking news and culture.