

This December, American children may find a new friend under the Christmas tree: a teddy bear that talks to them, answers their questions, asks questions of its own and gives advice. It is a chatbot disguised as a plush toy, the latest incarnation of artificial intelligence technology. After colonizing the adult world, AI is entering the world of children. These "companions," chatbots designed to form emotional bonds with humans, are making their debut in toy form. Hidden inside the soft toy is a Wi-Fi-connected voice box, linked to a language model trained to converse with children.
The startup Curio has started selling small characters (Grem, Grok and Gabbo) at $99 (about €84), and has programmed them to interact with children from the age of three, when they are just beginning language acquisition. Traditional toy manufacturers have been trying to keep up: in June, toy industry giant Mattel announced that it was working with OpenAI to create a Barbie doll that would be able to speak, via ChatGPT.
Toy manufacturers have put forward an argument designed to ease parents' guilt: Grem, Grok and others of their ilk are "less bad" than screens. When they are chatting with their plush robot, children are not glued to their tablets. Parents will also have access to daily transcripts of the conversations between the "bot" and their child (in the age of AI, there are no more secrets, even at 4 years old).
Risk of 'emotional dependency'
Experts, however, have sounded the alarm. Talking toys have certainly existed for years, but with the idea of a "companion," we have entered a new dimension. The traditional ways children learn the emotional cues necessary for social life – through interaction with their peers and other humans – may be disrupted by this new development.
When a schoolchild asks Alexa about the weather, Amazon's voice assistant simply provides information: that is a beneficial example of AI, experts say. When ChatGPT starts asking personal questions with a distinctly parental degree of concern – "How was your day?" for example – a disruptive process begins. "There is a real risk of creating emotional dependency," said Anne-Sophie Seret, general director of Everyone.AI, a non-governmental organization that studies the impact of artificial intelligence on cognitive and emotional development in people under 25.