Alert over legal loophole as perverted chatbots impersonate Jimmy Savile

Internet safety experts have raised the alarm over a possible loophole in UK online safety law after the discovery of chatbots imitating dead children.

It comes as The Telegraph can reveal that further disturbing bots have been created on Character.AI, including avatars of Jimmy Savile.

The Molly Rose Foundation, set up in memory of Molly Russell, has written to Ofcom warning of a “legislative gap” around digital chatbots, which could allow them to fall through the cracks of the UK’s crackdown on tech giants.

Last week, The Telegraph found digital avatars of Molly Russell and murdered teenager Brianna Ghey on Character.AI, a service where users create their own chatbots with customised personalities.

Brianna Ghey, a transgender teenager, was murdered in 2023

Ofcom says chatbots like those impersonating murdered teenager Brianna Ghey are ‘cruel and disgusting’ – PA

The user-generated bots used photos of Molly and Brianna, as well as biographical details of the teenagers. Molly took her own life in 2017 and Brianna was murdered last year. Character.AI has since removed the AI clones.

However, The Telegraph has discovered further AI bots that may be violating Character.AI’s rules, including avatars of Savile, the late BBC DJ and sex offender.

The Molly Rose Foundation said the bots were “grossly offensive” and “grossly insulting to social norms”, and asked Ofcom to clarify that they would be considered a form of “illegal content” under the Online Safety Act.

It also raised concerns that internet users could use services like Character.AI to create chatbots encouraging suicide or self-harm.

“While Character.AI appears to have some basic design safeguards, it is inherently foreseeable that other entities may attempt to design a chatbot that encourages suicide,” the letter reads.

An Ofcom spokesman said bots impersonating dead children were “cruel and disgusting” and described the issue as “urgent”.

Character.AI prohibits the promotion of self-harm or suicide by its users. However, the app, which is available to children aged 13 and over, has dozens of bots dedicated to depression or therapy.

Since its launch in 2022, the US service has grown in popularity among teenagers and is currently used by over 20 million people.

In a letter to Dame Melanie Dawes, Ofcom’s chief executive, Andy Burrows, chief executive of the Molly Rose Foundation, raised concerns that some elements of the Online Safety Act may not apply to chatbots, and that it is unclear whether a bot generating suicidal content of its own accord would be considered illegal.

This echoes a similar assessment by Jonathan Hall KC, the independent reviewer of terrorism legislation, who said earlier this year that while the Act does refer to “bots”, they “appear to be of the old-fashioned kind” rather than advanced artificial intelligence chatbots.

In one of its consultations, Ofcom said its approach to content published by a bot would be “not too different” from its approach to content published by a human.

Mr Burrows also raised concerns that key provisions of the Act on automated content moderation have been delayed and may not come into force until 2026.

The Telegraph found that bots imitating Savile accumulated tens of thousands of user chats before being taken down by Character.AI this week.

Users also created many bots posing as Josef Mengele, the Nazi doctor who performed deadly experiments on children at Auschwitz. The bots, some of which appeared to romanticize the notorious Nazi, conducted tens of thousands of chats in total.

The findings follow the death of Sewell Setzer, a 14-year-old from Florida who took his own life after spending hours talking to avatars on Character.AI. His mother sued the company for negligence.

Character.AI said the death was “tragic” and has taken steps to ensure a safer experience for users under 18.

An Ofcom spokesman said: “Impersonating dead children is a cruel and disgusting use of technology and our thoughts are with the families for the enormous distress this has caused.

“The use of a platform like Character.AI for these purposes raises important questions and we are urgently looking at the issues raised by The Telegraph’s investigation.

“We are in close contact with the Molly Rose Foundation and others and thank them for their continued support in ensuring the strictest regulations possible.”

A spokesperson for Character.AI said: “Character.AI takes safety on our platform seriously, and our goal is to provide a creative space that is immersive, engaging and safe.

“These characters were created by users and have been removed from the platform because they violate our terms of service.”
