Study warns AI toys for toddlers need stricter rules
Mar 12th 2026
A Cambridge study of three- to five-year-olds found an AI chatbot toy misheard children and gave awkward responses, prompting calls for tighter regulation and parental supervision.
- Cambridge researchers ran one of the first studies of under-fives playing with an AI chatbot toy called Gabbo.
- Children often struggled to converse with the toy, which talked over them and could not distinguish child from adult voices.
- Gabbo sometimes responded inappropriately to emotions and declarations of affection, which could confuse young children's social learning.
- Researchers and the Children's Commissioner are calling for regulation and standards to ensure psychological safety for under-fives.
- Parents are advised to supervise AI toys in shared spaces and to read privacy and safety information before use.
Articles
- AI may be giving teens bad nutrition advice www.sciencenews.org
- AI toys for young children need tighter rules, researchers warn www.bbc.com
- AI toys for young children must be more tightly regulated, say researchers www.theguardian.com