AI-powered toys that talk and respond to young children should face tighter rules because they can misread emotions and potentially harm early emotional development, Cambridge researchers have warned in a new study.
The research, described as the first of its kind, looked at how children aged three to five interacted with an AI-enabled stuffed toy called Gabbo, which uses generative AI to hold conversations and respond to children's feelings.
Researchers found that while the toy was designed to offer comfort and companionship, it sometimes replied in ways that ignored or dismissed a child's emotions. In one case, when a three-year-old said "I'm sad," Gabbo replied in a cheerful tone and moved the conversation on, a response experts say could teach children that their sadness does not matter, according to the BBC.
Dr. Emily Goodacre, a co-author of the study at the University of Cambridge, said AI toys could "misinterpret emotions or react inappropriately," leaving children without comfort from the toy while also discouraging them from seeking help from adults.
She warned that this is especially worrying because preschool years are a crucial time for learning how to name feelings, read social cues, and build healthy coping skills.
The Cambridge team is calling for stronger regulation and safety testing before such products are marketed for very young children. They argue that current consumer rules do not properly cover toys that act like emotional companions, collect data, and adapt to a child's behaviour over time.
UNICEF has also urged governments and companies to adopt "child‑centred AI" rules, including safety‑by‑design, clear limits on emotional manipulation, and strict privacy protections for products aimed at children.
Other child advocacy and education groups have raised similar alarms. A recent investigation by Common Sense Media into three popular AI toys found that more than a quarter of their responses were not appropriate for children, including content linked to self-harm, drugs, and unsafe role play, EdWeek reported.
Experts said these toys are built to create strong emotional bonds by remembering details, saying they "love" the child, and always agreeing, which can blur the line between a toy and a real friend.
Specialists also worry that constant interaction with AI companions could weaken real-world social skills and resilience if children rely on a device that never disagrees, sets boundaries, or makes mistakes in a human way.
Several advocacy groups now advise parents to avoid AI companion toys for children under five and to exercise extreme caution with older children until clearer safety standards and regulations are in place, according to UN News.
