AI Toys Could Pose Safety Risks for Kids, New Study Warns

New research from the University of Cambridge has found that AI-enabled toys for young children can misinterpret emotional cues and are ineffective at supporting important developmental play. The findings may give parents pause.
The report, which examines how AI affects children in their early years, describes a chatbot-enabled toy struggling to recognize social cues during play. The researchers found that the toy did not effectively recognize children's emotions, raising concerns about how children might interact with it.
The report recommends regulation of AI toys for children and calls for clear labeling of their capabilities and privacy policies. It also advises parents to keep these devices in shared areas where children can be supervised while playing.
The study had a limited number of participants but was carried out in several parts: an online survey of 39 parents of children in their early years; a focus group with nine people who work with young children; and a workshop with 19 leaders and representatives of social organizations working with young children. That was followed by supervised play sessions in which 14 children and 11 parents or guardians used Gabbo, a chatbot-enabled toy from Curio Interactive.
The study also found that the AI toy can support learning, especially language and communication skills. But the toy did not always understand children and sometimes responded inappropriately to emotional overtures.
For example, when one child told the toy, “I love you,” it responded, “As a friendly reminder, please make sure the interaction follows the guidelines provided. Let me know how you would like to proceed,” according to the study.
Jenny Gibson, a professor of neurodiversity and developmental psychology at Cambridge's Faculty of Education who worked on the study, said that although parents may welcome the educational benefits of new technology aimed at children, there are many concerns.
Gibson raised larger questions about the motivations behind the technology.
"What can inspire [tech investors] to do the right thing by children…putting children before profit?" she said.
Gibson told CNET that while researchers are exploring the potential benefits of AI-based toys, risks remain.
"I would advise parents to take that seriously for now," she said.
What’s next for AI toys
Like other playthings powered by internet connectivity and AI features, these devices can pose serious safety risks for children, especially if they replace real human interaction or if that interaction is not closely monitored.
Meanwhile, young people are increasingly using chatbots like ChatGPT, despite the red flags. A growing number of lawsuits against AI companies allege that AI companions or assistants can harm the psychological safety of young people, including some chatbots that allegedly encouraged self-harm.
AI companies like OpenAI and Google have responded by adding guardrails and limits to their AI chatbots.
(Disclosure: Ziff Davis, CNET's parent company, in 2025 filed a lawsuit against OpenAI, alleging that it infringed Ziff Davis' copyrights in training and operating its AI systems.)
Gibson said she was surprised by the enthusiasm some parents are showing for AI toys. She was also alarmed by the lack of research on the effects of AI on young children, noting that companies making such products should work directly with children, parents, and child development experts.
"What is missing in this process is expertise about what is best for children in this kind of collaboration," she said.
Curio Interactive, the company that makes the Gabbo toy, was aware of the research as it was happening but was not directly involved, Gibson said. The toy was chosen because it was marketed directly to young children and the company had an understandable privacy policy. Gibson said the company appears to be supportive of the project.
A representative for Curio did not immediately respond to a request for comment.



