Chatbot versions of teenagers Molly Russell and Brianna Ghey were discovered on Character.ai, a platform that allows users to create digital versions of people.
Molly Russell took her own life at the age of 14 after viewing suicide material online, and Brianna Ghey, 16, was murdered by two teenagers in 2023.
The foundation set up in memory of Molly Russell said it was “sickening” and an “utterly reprehensible failure of moderation”.
The platform is already being sued in the US by the mother of a 14-year-old boy who took his own life after becoming addicted to a Character.ai chatbot.
Character.ai told the BBC that it takes safety seriously and moderates the avatars people create “proactively and in response to user reports”.
“We have a dedicated Trust & Safety team who review reports and take action in accordance with our policies,” it added.
The company said it removed the user-generated chatbot after being alerted to it.
Andy Burrows, chief executive of the Molly Rose Foundation, said the creation of the bot was a “disgusting act that will cause further heartache to everyone who knew and loved Molly”.
“This clearly highlights why stronger regulation of both AI and user-generated platforms cannot come soon enough,” he said.
Brianna Ghey’s mother, Esther Ghey, told the Telegraph, which first reported the story, that it was yet another example of how “manipulative and dangerous” the online world can be.
A chatbot is a computer program that can simulate human conversation.
Recent rapid developments in artificial intelligence (AI) have made it more advanced and realistic, with more companies launching platforms that allow users to create digital “people” to interact with.
Character.ai, founded by former Google engineers Noam Shazeer and Daniel De Freitas, is one such platform.
The company’s terms of use prohibit using its platform to “impersonate any person or entity”, and its Safety Center states that its guiding principle is that its product should “never produce responses that are likely to harm users or others”.
The company said it uses automated tools and user reports to identify usage that violates its rules, and is also building a “trust and safety” team.
However, the company notes that “there is currently no perfect AI” and that safety in AI is “an evolving area”.
Character.ai is currently the subject of a lawsuit brought by Megan Garcia, a Florida woman whose 14-year-old son, Sewell Setzer, took his own life after becoming obsessed with an AI avatar inspired by a Game of Thrones character.
Garcia’s son discussed ending his life with the chatbot, according to records of their chats in Garcia’s court filing.
In their final conversation, Setzer told the chatbot that he was “coming home”, and it encouraged him to do so “as soon as possible”.
Shortly afterwards, he took his own life.
Character.ai told CBS News that it has protections specifically focused on suicidal and self-harm behavior, and that it plans to introduce stricter safety features for under-18s “soon.”