AI Toys Misread Children's Emotions, Sparking Calls for Stricter Safety Regulations

[Image: Child looking confused by an AI toy.]
Researchers are sounding the alarm over artificial intelligence-powered toys designed for young children, warning that these devices frequently misinterpret children's emotions and respond inappropriately. A groundbreaking study has highlighted significant concerns about the psychological safety of AI toys for children under five, urging immediate regulatory action and the establishment of robust safety standards.


Key Takeaways

  • AI toys often fail to understand children's emotional cues and pretend play.
  • Inappropriate responses can leave children feeling unsupported and confused.
  • There is a pressing need for tighter regulations and psychological safety standards for AI toys.
  • Parents are advised to supervise interactions and keep AI toys in shared spaces.

The Study's Findings

A year-long observational study, one of the first of its kind globally, focused on how children aged three to five interact with an AI-powered toy named Gabbo. The research, conducted by a team at the University of Cambridge, found that while parents were interested in the toys' potential for language and communication development, the reality fell short.


Children frequently struggled to engage in meaningful conversations with the toy. Gabbo often failed to register interruptions, spoke over the children, could not distinguish between adult and child voices, and gave awkward responses to expressions of affection or sadness. For instance, when a child said "I love you," the toy responded with a reminder to adhere to guidelines. Similarly, a child expressing sadness was met with a cheerful dismissal, potentially signalling that their feelings were unimportant.


Concerns Over Psychological Safety

Experts are concerned that these interactions could be confusing for children at a critical developmental stage where they are learning about social cues and emotional expression. "There's a lot of attention historically to physical safety... Now we need to start thinking about psychological safety too," stated Professor Jenny Gibson, a study co-author.


The study also noted that AI toys struggle with social and pretend play, which are crucial for early childhood development. When a child offered an imaginary present, the toy responded by stating it couldn't open it before changing the subject.


Calls for Regulation and Parental Guidance

Researchers are calling for regulators to act swiftly to ensure products marketed to under-fives offer "psychological safety." The report recommends clearer regulation, transparent privacy policies, and new labelling standards to help families assess the appropriateness of AI toys. Manufacturers are urged to test toys with children and consult safeguarding specialists before release.


Parents are advised to keep AI toys in shared spaces where interactions can be supervised and to carefully review privacy policies. The study also highlighted concerns from early years practitioners who noted a lack of reliable AI safety information and a need for more guidance in the sector.


Industry Response

Curio, the maker of Gabbo, stated that applying AI in children's products carries a "heightened responsibility" and that its toys are built around parental permission, transparency, and control. The company also indicated that research into children's interaction with AI toys is a top priority.


The call for stricter regulation has also been echoed by figures such as Dame Rachel de Souza, the Children's Commissioner, who emphasized the need for stringent safeguarding checks on AI tools used in early years settings.


